`
`UNITED STATES PATENT AND TRADEMARK OFFICE
`
`
`BEFORE THE PATENT TRIAL AND APPEAL BOARD
`
`
`APPLE INC.,
`Petitioner
`
`v.
`
`IMMERSION CORPORATION
`Patent Owner
`
`U.S. Patent No. 8,659,571
`Filing Date: February 21, 2013
`Issue Date: February 25, 2014
`Title: Interactivity Model for Shared Feedback on Mobile Devices
`
`
`Case IPR2016-01372
`
`
`
`REPLY DECLARATION OF PATRICK BAUDISCH
`
`
`
`
`Mail Stop Patent Board
`Patent Trial and Appeal Board
`P.O. Box 1450
`Alexandria, VA 22313-1450
`
`
`
`
`
`
`
`
`
`
`EXHIBIT LIST
`
`Exhibit No. Description
`
1001        U.S. Patent No. 8,659,571.

1002        Declaration of expert Dr. Patrick Baudisch (“Baudisch Decl.”).

1003        File history of U.S. Patent No. 8,659,571.

1004        U.S. Patent No. 5,734,373 to Rosenberg et al. (“Rosenberg ’373”).

1005        U.S. Patent Application No. 2010/0156818 to Burrough et al. (“Burrough”).

1006        U.S. Patent No. 6,429,846 to Rosenberg et al. (“Rosenberg ’846”).

1007        File history of U.S. Patent App. No. 13/472,698 (the “’698 application”).

1008        Excerpts from Barron’s Dictionary of Mathematics Terms, 3rd ed. (2009).

1009        Excerpts from The American Heritage Dictionary of the English Language, 5th ed. (2011).

1010        Patent Owner Immersion’s disclosure of preliminary claim constructions (Jun. 3, 2016).

1011        Patent Owner Immersion’s claim chart regarding alleged infringement of the ’571 patent by certain Apple iPhone products (Exhibit 5 to Immersion’s supplemental response to Apple’s interrogatory no. 19 in the ITC investigation).

1012        Patent Owner Immersion’s second claim chart regarding alleged technical domestic industry for the ’571 patent (Exhibit 51 to Immersion’s ITC Complaint).

1013        Affidavit of Mr. Robert Williams in Support of Motion for Pro Hac Vice Admission.
`
`
`
`
`
`
`
`
`
`
1014        Reply Declaration of Dr. Patrick Baudisch (“Baudisch Reply Decl.”).

1015        Visell Deposition Tr.
`
`
`
`
`
`
`
I. INTRODUCTION
`
1. I have been retained by counsel for Apple Inc. as an expert witness in
`
`the above-captioned proceeding. I have been asked to provide my opinion about
`
`the patentability of claims 1-4, 6, 23-26 and 28 of U.S. Patent No. 8,659,571 (the
`
`“’571 patent”).
`
2. I have been retained at my normal hourly rate of 600 per hour. No
`
`part of my compensation is dependent upon the outcome of the petition for Inter
`
`Partes Review or the specifics of my testimony.
`
`A. Background and Qualifications
`
`3. My background and qualifications were submitted in Exhibit 1002,
`
`including my resume, which was attached as Appendix A thereto.
`
B. Information Considered
`
`4. My opinions are based on my years of education, research, and
`
`experience, as well as my study of relevant materials. In forming my opinions, I
`
`have considered the materials identified in this declaration, as well as Paper 7,
`
`Decision Granting Institution of Inter Partes Review (“ID”), and Immersion
`
`Corporation’s Patent Owner Response (“POR”), including exhibits submitted with
`
`same, in particular Exhibit 2009, Declaration of Yon Visell, Ph.D. in Support of
`
`Immersion Corporation’s Patent Owner Response.
`
5. I may rely upon these materials and/or additional materials to respond
`
`to arguments raised by Patent Owner Immersion (“PO” or “Immersion”). I may
`
`
`
`
`
`
`
`
`also consider additional documents and information in forming any necessary
`
`opinions, including documents that may have not yet been provided to me.
`
`6. My analysis of the materials produced in this proceeding is ongoing
`
`and I will continue to review any new material as it is provided. This declaration
`
`represents only those opinions I have formed to date. I reserve the right to revise,
`
`supplement, or amend my opinions stated herein based on new information and on
`
`my continuing analysis of the materials already provided.
`
`II. LEGAL STANDARDS
`
7. The legal standards I apply are set forth in Exhibit 1002 and
`
`incorporated by reference herein.
`
III. BURROUGH DISCLOSES THE LIMITATION “GENERATING A DYNAMIC INTERACTION PARAMETER USING A FIRST GESTURE SIGNAL AND SECOND GESTURE SIGNAL”
`
`A. Burrough’s signals S are “gesture signals.”
`
8. PO’s primary argument is that the signals S generated during the
`
`course of Burrough’s zoom gesture are not “gesture signals” under the Board’s
`
`construction of that term, because these signals allegedly do not convey meaning
`
`or user intent. POR at 5-12. However, in my opinion, PO’s argument is premised
`
`on a misinterpretation of the Board’s construction. PO’s interpretation is contrary
`
`to the plain language of the construction, the Board’s analysis in arriving at its
`
`construction, and the ’571 patent specification.
`
`
`
`
`
`
`
`
1. PO Misinterprets the Board’s Construction of “gesture signal.”
`
9. The Board construed the term “gesture signal” to mean “a signal
`
`indicating a movement of the body that conveys meaning or user intent.” Paper
`
`No. 7 (“ID”) at 12. PO interprets this construction as imposing two requirements
`
`on each recited gesture signal: (1) the signal must indicate a movement of the
`
`body, and (2) the signal must convey meaning or user intent. For example, PO
`
`argues that the recited dynamic interaction parameter must be generated “based on
`
`a first signal that conveys meaning or user intent and a separate second signal
`
that conveys meaning or user intent.” Id. (emphasis added). Similarly, PO
`
`contends that the signals S disclosed by Burrough are not gesture signals, because
`
`“individual senses of touch (such as S1 and S2) do not convey meaning or user
`
`intent...” POR at 9.
`
`10. PO’s interpretation is inconsistent with the plain language of the
`
`construction. In the Board’s construction, i.e. “a signal indicating a movement of
`
`the body that conveys meaning or user intent” (ID at 12), the clause “that conveys
`
`meaning or user intent” modifies the phrase “movement of the body.” Thus, a
`
`plain reading of the construction suggests that the movement of the body—not the
`
`signal—must convey meaning or user intent.
`
`11. PO’s interpretation is also inconsistent with the Board’s rationale in
`
`adopting this construction. Specifically, the Board explained that the ’571 patent
`
`
`
`
`
`
`
`
`expressly defines a gesture as “any movement of the body that conveys meaning or
`
`user intent.” ID at 8 (quoting Ex. 1001 at 3:34-35). The Board further explained:
`
`As described in the Specification and indicated by the plain language
`
`of the claim term, a “gesture signal” is simply a signal indicating a
`
`“gesture.” See, e.g., id. at col. 10, ll. 36–43 (describing that multiple
`
`inputs in time from a finger being swiped across a touch screen
`
`indicate the positions of the contact point of the finger moving along
`
`the touch screen in a swipe gesture). In other words, a “gesture
`
`signal” is simply “a signal indicating a movement of the body that
`
`conveys meaning or user intent.”
`
`ID at 9 (emphases added). Accordingly, the Board’s construction requires that a
`
`gesture signal simply indicate a gesture, where a gesture is a movement of the body
`
`that conveys meaning or user intent. In other words, the movement of the body—
`
`not the signal—must convey meaning or user intent.
`
`12. PO’s interpretation is also inconsistent with the ’571 patent
`
`specification. The specification describes an embodiment in which multiple
`
`gesture signals are generated in response to a swipe gesture on a touch screen. ID
`
`at 9 (citing Ex. 1001 at 10:36-43). In this embodiment, the user swiping a finger
`
`across the touchscreen is a gesture, i.e. a movement of the body (swiping a finger
`
`across the touchscreen) that conveys meaning or user intent (scrolling among
`
`displayed photographs). Ex. 1001 at 10:36-39 (“Fig. 9B shows a screen view of a
`
`user gesture using a single index finger being swiped across the touch sensitive
`
`
`
`
`
`
`
`
`display…”) (emphasis added). The ’571 patent teaches that multiple inputs from
`
`the finger are generated during the course of the swipe gesture, and that each of
`
`these inputs “may occur at a different time and may indicate a different two
`
`dimensional position of the contact point of the index finger with the touch
`
`sensitive display.” Ex. 1001 at 10:39-43. The ’571 patent continues:
`
`Based upon the one or more inputs from the one or more user
`
`gestures in FIG. 9B, a dynamic haptic effect is provided during the
`
`user gesture and continuously modified as determined by the
`
`interaction parameter.
`
`Ex. 1001 at 10:45-49. Thus, the ’571 patent teaches that multiple inputs from the
`
`index finger during a swipe gesture may include first and second gesture signals
`
`used to generate a dynamic interaction parameter. Id. The multiple inputs from
`
`the index finger are “gesture signals” under the Board’s construction, because they
`
`are signals that indicate a movement of the body (swiping a finger across the
`
`touchscreen) that conveys meaning or user intent (scrolling among displayed
`
`photographs). Yet, under PO’s interpretation of the Board’s construction, these
`
`signals presumably would not be “gesture signals.” See POR at 9 (arguing that “a
`
`single indication that a finger has contacted a screen at a particular location… is
`
`not an indication of intent”). This further supports my opinion that PO’s
`
`interpretation of the Board’s construction is incorrect.
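For illustration only, the following Python sketch models the swipe embodiment described above. The class name, field names, and numeric values are hypothetical choices of my own and do not appear in the ’571 patent; the sketch simply shows how successive timestamped touch positions can serve as gesture signals from which a parameter (here, swipe speed) is continuously recomputed.

    from dataclasses import dataclass

    @dataclass
    class GestureSignal:
        """One touch-screen sample: a time and a 2-D contact position (hypothetical)."""
        t: float   # seconds
        x: float   # pixels
        y: float   # pixels

    def dynamic_parameter(first: GestureSignal, second: GestureSignal) -> float:
        """Hypothetical parameter: swipe speed computed from two successive samples."""
        dt = second.t - first.t
        if dt <= 0:
            return 0.0
        dx, dy = second.x - first.x, second.y - first.y
        return (dx ** 2 + dy ** 2) ** 0.5 / dt   # pixels per second

    # Successive samples generated while a finger is swiped across the screen.
    swipe = [GestureSignal(0.00, 100, 300),
             GestureSignal(0.02, 112, 300),
             GestureSignal(0.04, 130, 300)]

    # The haptic output can be modified continuously as each new sample arrives.
    for first, second in zip(swipe, swipe[1:]):
        print(f"swipe speed: {dynamic_parameter(first, second):.0f} px/s")

Each individual sample conveys only a time and a position, yet taken together the samples indicate the swipe gesture and yield a parameter that changes over time as the gesture proceeds.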
`
`
`
`
`
`
`
`
2. Burrough’s signals S are “gesture signals” under the Board’s Construction.
`
`13. The signals S generated during the course of Burrough’s zoom gesture
`
`are “gesture signals” under a proper interpretation of the Board’s construction.
`
`Burrough discloses a multi-touch zoom gesture, in which an image can be zoomed
`
`in or out by moving two fingers apart or together, respectively. Ex. 1005 at
`
`[0080]; Figs. 11, 12A-12H. The zoom embodiment is illustrated in a series of
`
`Figures, including Figs. 12B and 12C, reproduced below:
`
[Burrough Figs. 12B and 12C]
`Id. at Figs. 12B, 12C; [0082]. As illustrated in these figures, a user can zoom in on
`
`the displayed map by moving two fingers apart. Ex. 1005 at [0080]. Likewise, a
`
`user can zoom out on a map by moving two fingers closer together. Id.
`
`14. The movement of the user’s fingers during this zoom interaction is a
`
`gesture as that term is defined in the ’571 patent specification, i.e. “a movement of
`
`the body that conveys meaning or user intent.” The movement of the user’s fingers
`
`
`
`
`
`
`
`
`is clearly a movement of the body. And, the movement of the user’s fingers
`
`conveys the user’s intent to zoom in or zoom out on the displayed content.
`
`15. Burrough discloses that the user’s interaction with the touch screen
`
`during the zoom gesture is captured by one or more signals S generated by the
`
touch screen sensors. Ex. 1005 at [0046], [0079]. At any given moment in time,
`
`the touch screen sensors generate at least one signal S for each finger. For
`
`example, in Fig. 12B, the touch screen sensor generates two signals (which Dr.
`
`Visell refers to as S1 and S2) associated with the position of each finger. Ex. 2009
`
`at ¶ 36. Similarly, after the user’s fingers have moved to the positions depicted in
`
`Fig. 12C, the touch screen sensor generates two additional signals (which Dr.
`
`Visell refers to as S3 and S4) associated with the new positions of each finger. Id.
`
`at ¶ 42.
`
16. In my opinion, the signals S (e.g. S1, S2, S3 and S4) generated by the
`
`touch screen sensor are gesture signals under a proper interpretation of the Board’s
`
`construction, because these signals indicate a movement of the body (i.e., the
`
`movement of the user’s fingers) that conveys meaning or user intent (i.e., the intent
`
`to zoom in or zoom out).
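For illustration only, the following sketch (with coordinates and variable names of my own choosing, not taken from Burrough) shows how per-finger position signals of the kind discussed above relate to the zoom movement: a growing separation between the two contact points corresponds to a zoom-in movement, and a shrinking separation to a zoom-out movement.

    import math

    def separation(p, q):
        """Euclidean distance between two finger contact points (x, y)."""
        return math.hypot(q[0] - p[0], q[1] - p[1])

    # Hypothetical contact positions corresponding to the signals S1/S2
    # (first positions) and S3/S4 (new positions); coordinates are made up.
    s1, s2 = (200, 400), (240, 400)   # fingers relatively close together
    s3, s4 = (150, 400), (290, 400)   # fingers after moving apart

    d_before = separation(s1, s2)
    d_after = separation(s3, s4)

    # A growing inter-finger distance corresponds to a zoom-in movement;
    # a shrinking distance corresponds to a zoom-out movement.
    print("zoom in" if d_after > d_before else "zoom out")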
`
`
`
`
`
`
`
`
3. Burrough’s signals S are “gesture signals” even under PO’s interpretation.
`
`17. A POSITA would appreciate that the zoom gesture described by
`
`Burrough is a complex gesture that comprises multiple simple gestures, including,
`
`for example, individual finger down gestures and finger motion gestures.
`
18. In this regard, the ’571 patent expressly contemplates that complex
`
`gestures may be comprised of multiple simple gestures. Ex. 1001 at 3:35-56. For
`
`example, the ’571 patent discloses that “bringing a finger into contact with a touch
`
`sensitive surface may be referred to as a ‘finger on’ gesture, while removing a
`
`finger from a touch sensitive surface may be referred to as a separate ‘finger off’
`
`gesture.” Id. at 3:37-43. Further in this passage, the ’571 patent specification
`
`states “any number of… simple or complex gestures may be combined in any
`
`manner to form any number of other gestures…” Id. at 3:52-55.
`
`19. Burrough similarly contemplates that the multi-touch zoom gesture
`
`may comprise multiple simple gestures, such as finger down and finger move
`
`gestures. For example, Burrough discloses that “the set down of the fingers will
`
`associate or lock the fingers to a particular GUI object being displayed.” Ex. 1005
`
`at [0081]. Burrough further discloses that “when the fingers are moved apart, the
`
`zoom-in signal can be used to increase the size of the embedded features in the
`
`GUI object and when the fingers are pinched together, the zoom-out signal can be
`
`used to decrease the size of embedded features in the object.” Ex. 1005 at [0081].
`
`
`
`
`
`
`
`
20. The signals S generated during the course of Burrough’s zoom gesture
`
`are gesture signals even under PO’s interpretation of the Board’s construction,
`
because these signals (1) indicate a movement of the body and (2) convey the meaning or user intent of the simple gestures that comprise the zoom gesture. For example, Fig.
`
`12B depicts a user bringing two fingers into contact with the touch screen. When a
`
`user brings two fingers into contact with the touch screen as illustrated in Fig. 12B,
`
`at least two signals (e.g. S1 and S2) are generated. Signals S1 and S2 are each
`
`“gesture signals” under PO’s interpretation of the Board’s construction, because
`
`each signal (1) indicates a movement of the user’s body (i.e. bringing a finger into
`
`contact with the touchscreen) and (2) conveys meaning or user intent (i.e. to
`
`contact the touchscreen in a particular position). Similarly, Fig. 12C depicts a user
`
`moving two fingers to new positions. When a user moves two fingers to new
`
`positions as illustrated in Fig. 12C, at least two signals (e.g. S3 and S4) are
`
`generated. Signals S3 and S4 are each “gesture signals” under PO’s interpretation
`
`of the Board’s construction, because each signal (1) indicates a movement of the
`
`user’s body (i.e. moving the finger to a new location) and (2) conveys meaning or
`
`user intent (i.e. to zoom in or zoom out).
`
4. Each of Burrough’s signals S is a “gesture signal” under the Board’s construction.
`
`21. PO argues that, in Burrough, the intent to zoom in or zoom out is
`
`based upon a calculation that requires multiple signals S generated at different
`
`
`
`
`
`
`
`
`times, and that therefore, each individual signal S cannot convey meaning or user
`
`intent. POR at 9-14. For example, PO argues that “the user intent of zooming in
`
`or zooming out in Burrough cannot be determined by a single data point …
`
`provided by just one of the fingers – information from numerous signals S must be
`
`considered together.” Id. at 12. I disagree.
`
22. First, PO’s argument is premised on an interpretation under which the Board’s construction requires that the gesture signal (as opposed to a movement of
`
`the body) convey meaning or user intent. As discussed above, in my opinion, PO
`
`misinterprets the Board’s construction.
`
`23. Second, PO’s argument incorrectly assumes that each gesture signal
`
`must itself convey meaning or user intent. As the Board noted, its construction of
`
`gesture signal “does not exclude conveying meaning and user intent in conjunction
`
with other gesture signals.” ID at 26-27. This conclusion is well supported by the
`
`teachings of the ’571 patent. For example, as discussed above, the ’571 patent
`
`discloses generating multiple gesture signals during the course of a user’s swipe
`
`gesture, each of which “may occur at a different time and may indicate a different
`
`two-dimensional position” of the user’s finger. Ex. 1001 at 10:39-43. While an
`
`individual signal indicating the position of the finger may not itself convey the user
`
`intent to scroll, in conjunction with other signals generated during the course of the
`
`gesture, each signal indicates a movement of the body that conveys the user’s
`
`
`
`
`
`
`
`
`intent to scroll. Thus, the ’571 patent contemplates that individual gesture signals
`
`need not in and of themselves convey the full meaning or user intent of the gesture
`
`that they indicate. Rather, the gesture signals may convey a position or movement
`
`that may ultimately comprise a gesture or portion of a gesture.
`
5. Burrough teaches that the multi-touch zoom gesture comprises two substantially simultaneously occurring gestures.
`
`24. Patent Owner argues that Burrough’s teaching of “at least two
`
`substantially simultaneously occurring gestures using at least two different fingers
`
`or other object[s]” involves “two separate gestures with two separate intents,” and
`
`does not describe multiple signals generated by two fingers in the zoom
`
`embodiment. POR at 15. In my opinion, PO’s argument mischaracterizes the
`
`teachings of Burrough.
`
`25. Burrough teaches that “one aspect of the invention describes a touch
`
`sensitive input device able to recognize at least two substantially simultaneously
`
`occurring gestures using at least two different fingers or other objects (hereinafter
`
`referred to as a multi-touch event).” Ex. 1005 at [0035] (emphasis added).
`
`Burrough’s description of the zoom embodiment mirrors this language and makes
`
`clear that the zoom gesture involves two substantially simultaneously occurring
`
`gestures: “In the described embodiment, the nature of the multi-touch event can
`
`be determined based upon either the presence of at least two fingers indicating that
`
`
`
`
`
`
`
`
`the touch is gestural (i.e., multi-touch) rather than a tracking touch based on one
`
`finger and/or by the pressure asserted by the fingers on the surface 126.” Id.
`
`(emphasis added). Moreover, Burrough teaches that in the zoom gesture
`
`embodiment, “the vibro-tactile response provided to each finger can have the same
`
`profile or different profiles. For example, if it the pressure applied by one finger is
`
substantially greater than that applied by the other finger, then the vibro-tactile
`
`response for the two fingers can be different due to the varying pressure applied by
`
`each finger.” Id. at [0079]. In other words, Burrough contemplates that the
`
`gestures performed by each finger may result in different haptic responses for each
`
`finger. In view of these teachings, a POSITA would appreciate that the zoom
`
`embodiment involves two substantially simultaneously occurring gestures using
`
two different fingers, each of which potentially results in a different haptic
`
`response.
`
B. PO’s Arguments Regarding Tinfo Mischaracterize My Initial Declaration.
`
`26. PO contends that the “only” signals identified in the Petition and my
`
initial declaration as the claimed “gesture signals” in Burrough are the
`
`signal(s) S generated by sensing device 124. POR at 16-20. However, PO’s
`
`argument misrepresents my declaration.
`
`27. While my declaration certainly identifies the signals S as “gesture
`
`signals,” my identification of the recited gesture signal is not limited to only these
`
`
`
`
`
`
`
`
`signals. Rather, my declaration indicates that Burrough discloses a touch screen
`
`arranged to receive different types of touch events. Ex. 1002 at ¶ 57 (citing Ex.
`
`1005 at [1006]). These touch events may be used to implement a wide variety of
`
`gestures, including a zoom gesture. Id. at ¶¶ 58-59 (citing Ex. 1005 at [0017],
`
`[0079], Figs. 11, 12A-H). I explained that Burrough discloses that in response to a
`
`touch event T, “sensing device 124 generates touch signal S1 (and any other
`
`signal consistent with a multi-touch event),” which I identified as gesture signals.
`
`Id. at ¶ 60. In the context of dependent claim 2, I explained that signals generated
`
`in response to a touch event T may include signals representing the motion of the
`
finger (∂T/∂x, ∂T/∂y). Id. at ¶¶ 77-78 (citing Ex. 1005 at [0051]). And, in my
`
`declaration, I identified these signals as gesture signals. Id.
`
`28. Burrough discloses that signals representing the motion of the user’s
`
`fingers may be included in a Tinfo signal. For example, Burrough teaches that the
`
`signal(s) S generated by the touch screen sensor may be “converted” into Tinfo
`
`signals, which “can include location, direction, speed and acceleration information
`
`of touch event T.” Ex. 1005 at [0046]. As I explained at my deposition, signals S
`
`and Tinfo may be different representations of the same gesture signal originating
`
`from the touch screen. A POSITA would understand from Burrough’s teachings
`
`that Tinfo is a repackaged digital representation of the signals S generated by the
`
`touch screen. A POSITA would appreciate that Tinfo signals must necessarily
`
`
`
`
`
`
`
`
`originate from the touch screen signals S, because sensing device 124 is the
`
`specific hardware element disclosed for sensing a user’s touch. Accordingly, I
`
`disagree with PO’s contention that my identification of Tinfo signals as the recited
`
`“gesture signals” is somehow “new.” POR at 15-20.
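For illustration only, the following sketch (a simplified model of my own; the field and type names are assumptions, not Burrough’s implementation) shows how raw position samples of the kind represented by the signals S could be repackaged into a record carrying the location, direction, and speed information that Burrough attributes to Tinfo.

    import math
    from dataclasses import dataclass

    @dataclass
    class TouchSample:          # stand-in for a raw sensor signal S
        t: float                # time (seconds)
        x: float                # contact position (pixels)
        y: float

    @dataclass
    class TouchInfo:            # stand-in for a Tinfo-style record
        location: tuple
        direction: float        # heading in radians
        speed: float            # pixels per second

    def to_touch_info(prev: TouchSample, curr: TouchSample) -> TouchInfo:
        """Repackage two successive raw samples into location/direction/speed."""
        dx, dy, dt = curr.x - prev.x, curr.y - prev.y, curr.t - prev.t
        speed = math.hypot(dx, dy) / dt if dt > 0 else 0.0
        return TouchInfo(location=(curr.x, curr.y),
                         direction=math.atan2(dy, dx),
                         speed=speed)

    print(to_touch_info(TouchSample(0.00, 100, 200), TouchSample(0.02, 104, 203)))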
`
`IV. BURROUGH DISCLOSES THE “DYNAMIC INTERACTION
`PARAMETER” LIMITATION
`
`29. As I explained in Ex. 1002, Burrough discloses a dynamic interaction
`
`parameter, haptic response H(d), whose magnitude varies as a function of the
`
`distance between the user’s fingers during the course of a zoom gesture. Ex. 1002
`
`at ¶¶ 63-70. PO argues that haptic response H(d) is “neither dynamic nor
`
`generated,” because the function represented by H(d) is stored in memory. POR at
`
20-24. PO’s argument, however, confuses the function that defines H(d) (also
`
`referred to as the “haptic profile”), with the output of that function (the “haptic
`
`response”), which changes dynamically depending on the gesture signals it relies
`
`upon. Ex. 1005 at [0082], [0051].
`
30. In the context of Burrough’s zoom embodiment, the haptic profile
`
`used for each finger is a linear function of the distance between the user’s fingers.
`
`Ex. 1005 at [0082]. Figures 12B-H illustrate the relationship between distance d
`
`and the haptic response in a graphical format. For example, in Figure 12D
`
`(reproduced below), two graphs in the lower right corner show the linear
`
`
`
`
`
`
`
`
`relationship between distance d and haptic response output by H(d) for each finger
`
`touching the screen:
`
[Burrough Fig. 12D]
`Ex. 1005, Fig. 12D; [0082].
`
`31. As the distance changes, the magnitude of the haptic response H(d)
`
`changes as a linear function of the distance. For example, if the distance between
`
`the fingers increases, the magnitude of the haptic effect likewise increases. Ex.
`
`1005 at [0082]. Similarly, if the distance between the fingers decreases, the
`
`magnitude of the haptic effect likewise decreases. Id. Accordingly, while the
`
`linear function defining the haptic profile may remain static, the output of this
`
`function, i.e. haptic response H(d), is a parameter that changes over time based on
`
`the user’s interaction with the device. Thus, haptic response H(d) satisfies the
`
Board’s construction of “dynamic interaction parameter,” i.e. “a parameter that
`
`changes over time or reacts in real time based on a user’s interaction with a
`
`device.” ID at 13.
`
`
`
`
`
`
`
`
`32. Burrough also contemplates dynamically changing the haptic profile
`
`during the course of a zoom gesture (“as the zoom factor increases, the haptic
`
`profile H(d) can change by, for example, the slope becoming more steep as the
`
`resolution of the underlying map increases”). Ex. 1005 at [0082]. Thus, contrary
`
`to PO’s argument (POR at 23), a POSITA would understand that even the haptic
`
profiles associated with the zoom embodiment may change in response to the user’s
`
`interaction.
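For illustration only, the following sketch (the slope values and distances are arbitrary numbers of my own choosing, not values from Burrough) illustrates the distinction between the stored haptic profile, which may remain fixed, and its output H(d), which changes whenever the finger separation d changes, as well as Burrough’s suggestion that the slope itself may steepen as the zoom factor increases.

    def haptic_response(d, slope=0.5):
        """Linear haptic profile: the stored function can stay fixed, but its
        output H(d) changes whenever the finger separation d changes."""
        return slope * d

    # Output varies as the fingers move apart during a zoom gesture.
    for d in (40, 80, 120):
        print(f"d = {d:3d}  ->  H(d) = {haptic_response(d):5.1f}")

    # The profile itself may also steepen as the zoom factor grows.
    for zoom_factor, slope in ((1, 0.5), (2, 0.8), (4, 1.2)):
        print(f"zoom x{zoom_factor}: H(100) = {haptic_response(100, slope):5.1f}")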
`
`V. BURROUGH DISCLOSES THE ADDITIONAL LIMITATION OF
`CLAIM 2
`
`33. PO contends that Burrough’s touch sensor signals S do not comprise
`
`vector signals, because each signal S indicates “only that a finger has passed a
`
`particular sensor at a particular moment in time.” POR at 26. As such, PO argues
`
`that the momentary location of a finger cannot contain a magnitude and direction,
`
`and thus cannot be a vector signal. Id.
`
`34. However, PO ignores Burrough’s teachings that the haptic response H
`
`“can vary depending upon ... the location on surface 126 of touch event T (i.e.,
`
T(x)) [or] any finger motion (∂T/∂x, ∂T/∂y).” Ex. 1005 at [0051]. A POSITA
`
`would appreciate that as the user’s fingers move from one position to another in
`
`the context of Burrough’s zoom embodiment (e.g. from Fig. 12B to Fig. 12C), the
`
`haptic response may be generated based upon the relative movement of the user’s
`
fingers (∂T/∂x, ∂T/∂y). A POSITA would further appreciate that the motion of
`
`
`
`
`
`
`
`
`the user’s fingers on the touch screen must necessarily be detected by the touch
`
`screen sensors 124, as this is the particular hardware component disclosed by
`
`Burrough for sensing the motion of a user’s fingers on the touch screen.
`
35. Signals representing finger motion (∂T/∂x, ∂T/∂y) are vector
`
`signals, having both a magnitude (i.e. the difference between the current and
`
`previous position) and a direction (i.e. in the positive or negative direction in the x
`
`and/or y axis).
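For illustration only, the following sketch (with made-up coordinates of my own choosing) shows the sense in which a finger-motion signal carries both a magnitude and a direction.

    import math

    prev = (120, 300)   # previous contact position (pixels)
    curr = (150, 260)   # current contact position (pixels)

    dx, dy = curr[0] - prev[0], curr[1] - prev[1]

    magnitude = math.hypot(dx, dy)                 # length of the motion vector
    direction = math.degrees(math.atan2(dy, dx))   # signed heading in degrees

    print(f"motion vector ({dx}, {dy}): magnitude {magnitude:.1f} px, "
          f"direction {direction:.1f} degrees")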
`
`VI. CONCLUSION
`
36. I declare that all statements made herein of my own knowledge are
`
`true and that all statements made on information and belief are believed to be true,
`
`and further that these statements were made with the knowledge that willful false
`
`statements and the like so made are punishable by fine or imprisonment, or both,
`
under Section 1001 of Title 18 of the United States Code.
`
`
`
`Executed on August 4, 2017 in Germany.
`
`
`
`
`
`
`
`
`
`
`
`
`
`Dr. Patrick Markus Baudisch
`
`
`
`
`
`
`