`UNITED STATES PATENT AND TRADEMARK OFFICE
`____________
`
`BEFORE THE PATENT TRIAL AND APPEAL BOARD
`____________
`
`
`
`APPLE INC.,
`Petitioner,
`
`v.
`
`IMMERSION CORPORATION,
`Patent Owner.
`___________________
`
`Case IPR2016-01372
`Patent No. 8,659,571
`________________
`
`
`IMMERSION CORPORATION'S
`
`PATENT OWNER RESPONSE
`
`
`
`
`
`
`
`
`
`
`Mail Stop “PATENT BOARD”
`Patent Trial and Appeal Board
`U.S. Patent and Trademark Office
`P.O. Box 1450
`Alexandria, VA 22313-1450
`
`10180235
`
`
`
`
`
`
`
`
`TABLE OF CONTENTS
`
`Page
`
I.    INTRODUCTION ........................................................................................ 1

II.   THE INVENTION OF THE '571 PATENT ................................................ 2

III.  GROUND 1: CLAIMS 1-4, 6, 23-26 AND 28 ARE NOT OBVIOUS
      OVER BURROUGH .................................................................................... 5

      A.    Burrough Does Not Disclose Claims 1 and 23 Because
            Two Gesture Signals Are Not Used To Form A Single
            Dynamic Interaction Parameter .......................................................... 5

            1.    Burrough does not teach generating a dynamic interaction
                  parameter using a first gesture signal and a second gesture
                  signal ........................................................................................ 5

            2.    Dr. Baudisch’s new argument that multiple Tinfo signals could
                  constitute the claimed gesture signals is inaccurate .............. 16

      B.    Burrough Does Not Disclose Claim 1 Because It Does
            Not Teach “Generating” a “Dynamic Interaction
            Parameter” ........................................................................................ 20

      C.    Burrough Does Not Render Obvious Claim 1 Because
            There Is No Evidence a POSITA Would Have Modified
            Burrough ........................................................................................... 24

      D.    Burrough Does Not Disclose Or Render Obvious Claim 2
            Because the Supposed “Gesture Signals” of Claim 1 Do
            Not Include Magnitude And Direction ............................................ 25

IV.   CONCLUSION ........................................................................................... 29
`
`10180235
`
`
`- ii -
`
`
`
`
`
`
`
`
`
TABLE OF AUTHORITIES

                                                                                 Page(s)

Cases

KSR Int’l Co. v. Teleflex Inc.,
550 U.S. 398 (2007) ............................................................................................ 24

Personal Web Technologies, LLC v. Apple,
Slip Op. 16-1174 ................................................................................................. 24
`
`
`
`
`
`Case IPR2016-01372
`Patent No. 8,659,571
`
`
`
`EXHIBIT LIST
`
`
`
Immersion Ex. 2001    Declaration of Yon Visell, Ph.D. in Support of Immersion
                      Corporation’s Patent Owner Preliminary Response, dated
                      October 13, 2016

Immersion Ex. 2002    Dictionary.com, http://www.dictionary.com/browse/gesture?s=t
                      (last visited Oct. 11, 2016)

Immersion Ex. 2003    August 2, 2012 Applicant Remarks in Prosecution of U.S. Patent
                      No. 8,279,193

Immersion Ex. 2004    U.S. Patent Application Publication No. US 2007/0279392
                      (Rosenberg ’392)

Immersion Ex. 2005    July 19, 2012 Non-Final Rejection in Prosecution of U.S. Patent
                      No. 8,279,193

Immersion Ex. 2006    May 16, 2012 Original Claims in Prosecution of U.S. Patent No.
                      8,279,193

Immersion Ex. 2007    Curriculum Vitae of Yon Visell, Ph.D.

Immersion Ex. 2008    Oct. 11, 2016 Joint Proposed Claim Construction Chart submitted
                      by Apple, Immersion, and OUII Staff in ITC Investigation Nos.
                      337-TA-990 and 337-TA-1004

Immersion Ex. 2009    Declaration of Yon Visell, Ph.D., dated May 31, 2017

Immersion Ex. 2010    Deposition Transcript of Dr. Patrick M. Baudisch
`
`
`
`
`
`
`
I.    INTRODUCTION
`
`
`Patent Owner Immersion Corporation (“Immersion” or “Patent Owner”)
`
`submits this Response to the Board’s Decision – Institution of Inter Partes Review
`
(Paper 7) (“Decision”), entered January 11, 2017, instituting Inter Partes Review

of United States Patent No. 8,659,571 (“the '571 patent”) on the petition filed by

Apple, Inc. (“Petitioner”).
`
`The ’571 claims recite applying a drive signal to a haptic output device
`
`according to a dynamic interaction parameter. The dynamic interaction parameter
`
`is generated using a first gesture signal and a second gesture signal. The Board
`
`determined that there was a reasonable likelihood that claims 1-4, 6, 23-26 and 28
`
`of the ’571 patent were obvious over Burrough.
`
`Burrough discloses a system where signals S are generated by a sensing
`
`device. Each signal S merely represents a position of a finger (e.g. x/y position) at
`
`a moment in time. These S signals are what Petitioner points to as the first and
`
`second gesture signals. However, this mapping is at odds with the Board’s
`
`construction of “gesture signal” as “a signal indicating a movement of the body
`
`that conveys meaning or user intent.” See Institution Decision at 12. In particular,
`
`each signal S does not indicate a movement of the body that conveys meaning or
`
`user intent, because an x/y position, standing alone in Burrough’s system, does not
`
`
`
`
`
`- 1 -
`
`
`
`
`include sufficient information from which to discern a meaning or user intent.
`
`
`Indeed, Petitioner’s expert admitted that a user action such as touching the screen
`
`with two fingers (thus generating two of the signals S Petitioner points to as the
`
`gesture signals) does not constitute a gesture unless the intent can be clearly
`
`discerned. Ex. 2010 at 50:20-24 (explaining that moving a cursor may not be a
`
`gesture because "[i]t's not clear what the intent is"). Because the signals S that
`
`Petitioner points to do not indicate a movement of the body that conveys meaning
`
`or user intent in Burrough’s system, they cannot constitute the claimed gesture
`
`signals. Ex. 2009 at ¶¶ 34-47.
`
Furthermore, based on the statements of Petitioner’s own expert, Burrough

does not disclose "generating" the dynamic interaction parameter.
`
`Petitioner points to the haptic profile H(d) as the supposed dynamic interaction
`
`parameter, but Burrough clearly teaches that H(d) is stored in memory and
`
`retrieved from a database, and thus not “generated.” Ex. 2009 at ¶¶ 53-60.
`
`II. THE INVENTION OF THE '571 PATENT
`
The ‘571 patent is directed to a novel way of producing haptic effects in

electronic devices. Ex. 2009 at ¶¶ 29-32. As noted in the Background of the ‘571

patent, “[t]raditional architectures that provide haptic feedback only with triggered
`
`10180235
`
`
`
`- 2 -
`
`
`
`
`effects are available, and must be carefully designed to make sure the timing of the
`
`
`haptic feedback is correlated to user initiated gestures or system animations.
`
`However, because these user gestures and system animations have variable timing,
`
`the correlation to haptic feedback may be static and inconsistent and therefore less
`
`compelling to the user.” Ex. 1001 at 1:49-56; Ex. 2009 at ¶¶ 31-32. The ‘571
`
`patent discloses a dynamic haptic effect that is created using an interaction
`
`parameter. Ex. 1001, 10:22-26. The interaction parameter is generated with
`
`gesture signals. Ex. 1001, 15:3-7. A drive signal is then applied to a haptic output
`
`device in accordance with the interaction parameter. Ex. 1001, 15:8-9, Fig. 14. By
`
`way of example, the interaction parameter can be synthesized with one of the
`
methods shown in Table 2, reproduced below.
`
`10180235
`
`
`
`- 3 -
`
`
`
`
`
`
The interaction parameter is dynamic, which provides the following
`
`advantages:
`
`
`
`The effect of providing or modifying a dynamic haptic effect in real-
`time during and even after a user gesture is that no two gestures such
`as page turns or finger swipes will feel the same to the user. That is,
`the dynamic haptic effect will always be unique to the user gesture,
`thereby creating a greater sense [of] connectedness to the device and
`more compelling user interface experience for the user as compared to
`a simple static haptic effect provided by a trigger event.
`
`Ex. 1001, 10:62-11:3.
`
`Claim 1 recites:
`
`10180235
`
`
`
`- 4 -
`
`
`
`
`
`
`1.
`
`A method for producing a haptic effect comprising:
`
`receiving a first gesture signal;
`
`receiving a second gesture signal;
`
`generating a dynamic interaction parameter using the first gesture signal and
`
`the second gesture signal; and
`
`applying a drive signal to a haptic output device according to the dynamic
`
`interaction parameter.
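For orientation only, the following is a minimal, hypothetical sketch of the flow recited in claim 1. It is offered purely as illustration and is not code from the '571 patent; the function names, the simple averaging used to combine the two gesture signals, and the stand-in device interface are all assumptions made for the example.

# Illustrative sketch of the claim 1 flow; all names and values are hypothetical.

def generate_dynamic_interaction_parameter(first_gesture_signal, second_gesture_signal):
    # Generate a single dynamic interaction parameter using BOTH gesture
    # signals (a simple average is used here purely for illustration).
    return (first_gesture_signal + second_gesture_signal) / 2.0

def apply_drive_signal(drive_log, dynamic_interaction_parameter):
    # Apply a drive signal to a haptic output device according to the dynamic
    # interaction parameter (a list stands in for a real actuator interface).
    drive_log.append(dynamic_interaction_parameter)

drive_log = []
first_gesture_signal = 0.8    # hypothetical received first gesture signal
second_gesture_signal = 0.3   # hypothetical received second gesture signal
parameter = generate_dynamic_interaction_parameter(first_gesture_signal, second_gesture_signal)
apply_drive_signal(drive_log, parameter)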
`
`III. GROUND 1: CLAIMS 1-4, 6, 23-26 AND 28 ARE NOT OBVIOUS
`OVER BURROUGH
`
`A. Burrough Does Not Disclose Claims 1 and 23 Because Two
`Gesture Signals Are Not Used To Form A Single Dynamic
`Interaction Parameter
`
`1.
`
`Burrough does not teach generating a dynamic interaction
`parameter using a first gesture signal and a second gesture signal
`
`Claim 1 requires “generating a dynamic interaction parameter using the first
`
`gesture signal and the second gesture signal”—in other words, the dynamic
`
`interaction parameter must be generated using both “a first gesture signal” and “a
`
`second gesture signal.” The dynamic interaction parameter is then used to provide
`
`a haptic output. See Ex. 1001 at Claim 1 (“applying a drive signal to a haptic
`
`output device according to the dynamic interaction parameter”).
`
`10180235
`
`
`
`- 5 -
`
`
`
`The Board construed the term gesture signal as "a signal indicating a
`
`
`
`
`movement of the body that conveys meaning or user intent." Decision at 12.
`
`Inserting this construction for gesture signal into the claims, the dynamic
`
`interaction parameter must be generated with a first signal indicating a movement
`
`of the body that conveys meaning or user intent and a separate second signal
`
`indicating a movement of the body that conveys meaning or user intent. That is, a
`
`single haptic output must be based on a first signal that conveys meaning or user
`
`intent and a separate second signal that conveys meaning or a user intent.
`
`Petitioner points to “signal(s) S,” which are “signals representing each touch
`
`on the touch screen” (Paper 7 at 25) as the claimed first and second gesture signals.
`
`In particular, Burrough teaches:
`
`In response to the pressure applied by the user during touch event T,
`sensing device 124 generates touch signal S1 (and any other signal
`consistent with a multi-touch event). Touch signal S1 can be
`monitored by an electronic interface (not shown) and passed to
`processor 106. Processor 106, in turn, can convert the number,
`combination and frequency of the signal(s) S into Touch information
`Tinfo that can include location, direction, speed and acceleration
`information of touch event T.
`
`10180235
`
`
`
`- 6 -
`
`
`
`
`Ex. 1005 at ¶ 46. This portion of Burrough teaches that for each touch in a multi-
`
`
`touch event, a signal S is generated. For instance, in a single-finger touch event,
`
`signal S1 would be generated at a moment in time to reflect the one-finger touch.
`
`Ex. 2009 at ¶ 36. For a two-finger touch event, both signals S1 and S2 would be
`
`generated at a moment in time to reflect the two-finger touch. Ex. 2009 at ¶ 36.
`
This is confirmed by Burrough’s disclosure that sensing device 124 produces “an
`
`electrical signal . . . each time a finger (or other appropriate object) passes a
`
`sensor.” Ex. 1005 at ¶ 42. In other words, each signal (such as signals S1, S2,
`
`etc.) is a representation of a finger passing a sensor at a given moment in time. Ex.
`
`2009 at ¶ 36. When these signals are considered collectively, information such as
`
`speed and direction can be determined. Ex. 2009 at ¶ 36; Ex. 1005 at ¶ 42 (“the
`
`more signals, the more the user moved his or her finger”); Ex. 1005 at ¶ 46
`
`(“Processor 106 . . . can convert the number, combination and frequency of the
`
`signal(s) S into touch information Tinfo that can include location, direction, speed
`
`and acceleration information of touch event T.”). But taken in isolation, each
`
`signal S1 and S2 merely indicates that a finger has passed a sensor on sensing
`
`device 124 at a particular moment in time. See Ex. 1005 ¶¶ 42, 46; Ex. 2009 at
`
`¶ 36.
`
`10180235
`
`
`
`- 7 -
`
`
`
`The “signal(s) S” such as S1, S2, etc., representing each touch at a moment
`
`
`
`
`in time as applied to the zoom embodiment of Burrough are what Petitioner maps
`
`to the gesture signals in Petitioner’s claim 1 analysis. For instance, Petitioner
`
`specifically equates the two signals S1 and S2 resulting from sensing device 124
`
`and representing two different touches as a first gesture signal and second gesture
`
signal respectively. Pet. at 15-16 (quoting Ex. 1005 and explaining that “touch
`
`signal S1” is a “gesture signal”); id. at 16 (explaining that in “a multi-touch zoom
`
`gesture,” “sensing device 124 generates signals representing each touch on the
`
`touchscreen,” and that “a POSITA would understand that the sensing device
`
`generates a first gesture signal representing one of the two fingers on the touch
`
`screen, and a second gesture signal representing the other finger on the
`
`touchscreen”). Petitioner’s expert, Dr. Baudisch, confirms that the only signals he
`
`points to as the “first gesture signal” and “second gesture signal” come from
`
`sensing device 124. Ex. 2010 at 16:9-12 (agreeing that “the gesture signals then
`
`are created by sensing device 124”).
`
`Petitioner’s contention that signals S1 and S2 in a multi-touch zoom gesture
`
`are the claimed “first gesture signal” and “second gesture signal” is unfounded,
`
`because neither of the S1 or S2 signals is a “signal indicating a movement of the
`
`body that conveys meaning or user intent.” Each of S1 and S2 is merely an
`
`10180235
`
`
`
`- 8 -
`
`
`
`
`indication that a user object (such as a finger) has come into contact with a sensor
`
`
`at a particular moment in time. Ex. 2009 at ¶ 38.
`
`Petitioner’s expert admits that a single indication that a finger has contacted
`
`a screen at a particular location (such as that provided by S1 or S2) is not an
`
`indication of intent in Burrough’s zoom gesture embodiment. See Ex. 2010 at
`
`43:17-44:15 (explaining that intent is only determined once the distance between
`
`two fingers can be understood as increasing or decreasing). The fact that
`
individual touch signals (such as S1 and S2) do not convey meaning or user
`
`intent is confirmed by Figure 11 of Burrough. Figure 11, reproduced below, is a
`
`flow-chart “diagram of a zoom gesture method” (Ex. 1005 at ¶ 79)—the same
`
`embodiment that Petitioner relies upon for obviousness. See, e.g., Pet. at 16.
`
`10180235
`
`
`
`- 9 -
`
`
`
`
`
`Fig. 11
`
`Case IPR2016-01372
`
`Case IPR2016-01372
`Patent No. 8,659,571
`Patent No. 8,659,571
`
`111210
`
`the presence efet
`least a first finger
`and a second finger
`
`1102
`
`an: deleeled
`
`
`I: e presence affine
`twn fingers. represents
`a gestm'e
`
`
`
`HEM-
`
`ltsptit: devices nearest the
`touch point an: set to eclin
`made in antler to provide a
`vibrataetile
`
`distanee between at least the
`
`hm fingers is emnpered
`
`Distance increasing?
`
`
`
`1114
`
`
`
`111$
`
`
`
`Generate mum in signal
`
`Generate 2mm trut signal
`
`
`
`
`Generate mum in haptie
`signal
`
`Generate zoom out haptie
`signal
`
`
`
`10180235
`
`10180235
`
`
`
`- 10 -
`
`-10-
`
`
`
`Figure 11 shows that the process does not begin until the presence of a first
`
`
`
`
`finger and the presence of a second finger is detected. Such a detection would
`
`generate at least two signals (e.g., an S1 signal and an S2 signal). Ex. 2009 at ¶ 40;
`
`Ex. 2010 at 36:10-13 (agreeing that “when two fingers touch a screen, . . . there
`
`must be more than one S1 signal being produced”). Accordingly, for Burrough's
`
`system to even begin the decision flow to determine whether a zoom in or zoom
`
`out gesture may occur, at least two signals S must be generated by sensing device
`
`124. Ex. 2009 at ¶ 40.
`
`After the presence of two fingers is detected (as a result of two separate
`
`signals S1 and S2 detected simultaneously), the distance between the two fingers is
`
`compared in step 1108. This step likewise requires a comparison between two S
`
`signals—the position associated with a first signal S1 can be compared with the
`
`position of a second signal S2 to calculate a distance. Ex. 2009 at ¶ 41; see Ex.
`
`1005 at ¶ 42 (“an electrical signal is produced each time a finger (or other
`
`appropriate object) passes a sensor”).
`
`Then, in step 1110, the process determines whether the distance between
`
`fingers is increasing or decreasing. Because movement for each finger is
`
`represented by multiple signals S, the determination of whether distance is
`
`increasing or decreasing would require knowing even more signals S. Ex. 2009 at
`
`10180235
`
`
`
`- 11 -
`
`
`
`
`¶ 42; see also Ex. 1005 at ¶ 42 (explaining that multiple signals need to be
`
`
`examined to determine the distance a user moved a single finger—“the more
`
`signals, the more the user moved his or her finger”). For example, if S1 and S2
`
`represent a first distance at a first moment in time, and S3 and S4 represent a
`
`second distance at a second moment in time, the system can compare the two
`
`distance values to determine whether the distance is increasing or decreasing.
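The comparison described above can be made concrete with a brief, hypothetical sketch. The positions and decision logic below are an illustration of the reading set out in this section, not an implementation disclosed in Burrough; the point is only that the zoom in or zoom out determination of step 1110 requires distances computed from several S samples taken at different moments in time.

# Hypothetical illustration of step 1110: the intent to zoom in or zoom out can
# be inferred only by comparing distances derived from multiple S samples.
import math

def distance(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

# Finger positions at a first moment in time (from signals S1 and S2):
s1, s2 = (10.0, 20.0), (30.0, 20.0)
# Finger positions at a later moment in time (from signals S3 and S4):
s3, s4 = (5.0, 20.0), (35.0, 20.0)

d_first = distance(s1, s2)    # distance at the first moment
d_second = distance(s3, s4)   # distance at the later moment

# No single sample, standing alone, answers this question:
if d_second > d_first:
    print("distance increasing -> zoom in inferred")
else:
    print("distance decreasing or unchanged -> zoom out inferred")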
`
`This step 1110, which determines whether the distance is increasing or
`
`decreasing, makes the determination regarding user intent, because it determines
`
`whether a zoom in or a zoom out signal should be generated. Ex. 2009 at ¶ 43;
`
`Fig. 11 (showing step 1110 branching between two options—“Generate zoom in
`
`signal” 1112 and “Generate zoom out signal” 1114). The user intent of zooming in
`
`or zooming out in Burrough cannot be determined by a single data point (such as
`
`S1 or S2) provided by just one of the fingers—information from numerous signals
`
`S must be considered together. Accordingly, one signal S1 (and even two signals
`
`S1 and S2) cannot indicate a movement of a body that conveys meaning or user
`
`intent in Burrough's zoom gesture. Ex. 2009 at ¶ 43.
`
`Petitioner does not explain how each individual signal, such as S1 or S2,
`
`could supposedly constitute a gesture signal in the zoom gesture embodiment of
`
`Burrough. Indeed, the fact that each individual S signal does not indicate a
`
`10180235
`
`
`
`- 12 -
`
`
`
`
`movement of the body that conveys meaning or user intent is further confirmed by
`
`
`Petitioner’s own expert, Dr. Baudisch. He agreed that in Burrough’s zoom gesture
`
`embodiment, no conclusion about user intent to zoom in or zoom out can be made
`
`until step 1110 in Figure 11, which considers the position of at least two fingers
`
`over some time period:
`
`But the main decision in figure 11 seems to be shown in figure—in
`1110, where it’s actually checking if the distance is increasing or
`decreasing. That certainly is a point at which the system, you know,
`seems to draw conclusions about user intent, which is whether to
`zoom in or zoom out.
`
`Ex. 2010 at 43:17-44:15. Accordingly, based upon Dr. Baudisch's own testimony,
`
`even looking at two S signals simultaneously is insufficient to indicate a movement
`
`of the body that conveys meaning or user intent in Burrough’s zoom embodiment.
`
`Rather, a greater number of S signals must be examined over time before that
`
`intent can be determined. Ex. 2009 at ¶ 44.
`
`Furthermore, Dr. Baudisch admitted that there are instances in Burrough
`
`where placing two fingers on the screen simultaneously conveys an entirely
`
`different intent than an intent to zoom. For instance, Burrough teaches that “a first
`
`object can be dragged with one finger while a second object can be dragged with
`
`another finger.” Ex. 1005 at ¶ 45. Dr. Baudisch admits that these are two
`- 13 -
`
`10180235
`
`
`
`
`
`
`simultaneously occurring gestures. Ex. 2010 at 52:14-22. Accordingly, even
`
`
`looking at two individual S1 and S2 signals, indicating that there are two fingers
`
`touching the screen, is insufficient to determine that any sort of zoom will be
`
`initiated—other gestures, such as dragging objects across the screen, could also be
`
`possible. Ex. 2009 at ¶ 45.
`
`Because each S signal cannot individually indicate a movement of the body
`
`that conveys a meaning or user intent in Burrough's zoom gesture embodiment, it
`
`is clear that each S signal is not a "gesture signal" as the term was construed by the
`
`Board. Petitioner may argue, however, that each signal S still conveys some
`
`meaning or user intent, even if it does not necessarily convey an intent to zoom.
`
`This argument, however, is contradicted by Dr. Baudisch, who testified that even a
`
`movement of a finger across a screen (which would generate numerous signals S
`
`over time) might not indicate a movement of the body that conveys meaning or
`
`user intent. For instance, he stated that “moving a cursor with [a] finger” may not
`
`be a gesture because “[i]t’s not clear what the intent is.” Ex. 2010 at 50:18-51:2.
`
`Accordingly, Petitioner’s own expert admits that meaning or user intent must be
`
`discernable for a gesture to occur, and that not every signal S (or even a series of
`
`signals S over time) allows meaning or user intent to be discerned. Petitioner has
`
`not identified any meaning or intent that would be conveyed by a single signal S in
`
`10180235
`
`
`
`- 14 -
`
`
`
`
`Burrough's zoom embodiment, and a POSITA would not understand these signals
`
`
`to individually convey meaning or user intent. Ex. 2009 at ¶ 46.
`
`Finally, the portion of Burrough which the Board relied upon in the
`
`Institution Decision does not contradict the understanding that the signal(s) S are
`
`not the claimed gesture signals. In particular, the Board correctly recognized that
`
`Burrough teaches that it is capable of recognizing "at least two substantially
`
`simultaneously occurring gestures using at least two different fingers or other
`
`object[s]." Decision at 25. This passage, however, does not support Petitioner’s
`
`contention that each signal representing a touch in Burrough’s zoom embodiment
`
`is a separate gesture signal. Rather, it simply indicates that, in the event the user
`
`performs two separate gestures with two separate intents simultaneously, Burrough
`
`can recognize those gestures. Ex. 2009 at ¶ 47. For example, Burrough teaches
`
`that “a first object can be dragged with one finger while a second object can be
`
`dragged with another finger.” Ex. 1005 at ¶ 45. As Petitioner’s own expert
`
`admits, these are two simultaneously occurring gestures. Ex. 2010 at 52:14-22;
`
`Ex. 2009 at ¶ 47. Accordingly, the portion of Burrough that teaches that two
`
`separate gestures can occur simultaneously does not indicate that Burrough’s zoom
`
`embodiment is comprised of two separate gestures, and certainly does not indicate
`
`10180235
`
`
`
`- 15 -
`
`
`
`
`that each particular S signal such as S1 or S2 indicates a movement of the body
`
`
`that conveys meaning or user intent.
`
`2.
`
`Dr. Baudisch’s new argument that multiple Tinfo signals could
`constitute the claimed gesture signals is inaccurate
`
`In the Petition and in the declaration of Dr. Baudisch, Petitioner pointed to only
`
`the “signal(s) S” generated by sensing device 124 as constituting the first and
`
`second gesture signals. Pet. 15-16; Ex. 1002 ¶¶ 57-62. As explained above, these
`
`signals indicate only that a user’s finger has passed a sensor at a particular moment
`
`in time, and do not indicate a movement of the body that conveys meaning or user
`
`intent in Burrough’s zoom embodiment. Ex. 2009 at ¶ 48.
`
After Dr. Baudisch was questioned about the signals S generated by sensing device 124,

it became clear that he sought to change his position with respect to what
`
`constitutes Burrough’s gesture signals. During his deposition, Dr. Baudisch was
`
`instructed to read from Paragraph 46 of Burrough, which recites in relevant part:
`
`In the simplest case, a touch event T is initiated each time an
`
`object, such as a user's finger, is placed on upper surface 126 over, or
`in close proximity to, sensing region 128. Pressure generated by touch
`event T is transmitted through protective layer 120 at sensing region
`128 to sensing device 124. In response to the pressure applied by the
`user during touch event T, sensing device 124 generates touch signal
`S1 (and any other signal consistent with a multi-touch event). Touch
`- 16 -
`
`10180235
`
`
`
`
`
`
`
`
`signal S1 can be monitored by an electronic interface (not shown) and
`passed to processor 106. Processor 106, in turn, can convert the
`number, combination and frequency of the signal(s) S into touch
`information Tinfo that can include location, direction, speed and
`acceleration information of touch event T. Processor 106 can then
`pass touch information Tinfo to micro-controller 132.
`
`Ex. 2010 at 16:4-25; Ex. 1005 at ¶ 46.
`
`After reading this portion of Burrough, Dr. Baudisch testified that, contrary
`
`to his report, signal(s) S were not the only signals he would like to identify as the
`
`first and second gesture signals. He stated that a “Tinfo” signal generated by
`
processor 106 could merely be a repackaging of one of the S signals (such as
`
`S1 or S2), such that Tinfo could also constitute a gesture signal. Ex. 2010 at
`
`28:23-29:9.
`
`By raising the new argument that Tinfo is merely a “repackaged” S1 signal,
`
`Dr. Baudisch presumably seeks to imply that signal(s) S could also contain
`
`information concerning direction, speed, and acceleration. In particular, Burrough
`
`teaches that Tinfo “can include location, direction, speed and acceleration
`
`information” (Ex. 1005 at ¶ 46), and thus equating Tinfo with signals such as S1
`
`and S2 would imply that S1 and S2 also can include direction, speed, and
`
`acceleration. See, e.g., Ex. 2010 at 27:2-12 (implying that signals originating from
`
`10180235
`
`
`
`- 17 -
`
`
`
`
`sensing device 124 could include a variety of information because “the inventor
`
`
`leaves this largely open what exactly comes out of that sensor”). Petitioner may
`
`intend to argue that if S1 is equivalent to Tinfo, S1 would provide sufficient
`
`information to indicate a movement of the body that conveys meaning or user
`
`intent.
`
`To the extent Petitioner seeks to raise the argument that Tinfo signals are
`
`effectively the same as the signal(s) S, it should not be considered because it was
`
`not properly raised in the Petition. Furthermore, the argument is not supported by
`
`any evidence in Burrough. Ex. 2009 at ¶ 51. Burrough specifically teaches that
`
processor 106 “can convert the number, combination and frequency of the signal(s)
`
`S into touch information Tinfo.” Ex. 1005 at ¶ 46. This plainly shows that Tinfo
`
`includes information in the aggregate about a series of signal(s) S, because the
`
`“number,” “combination,” and “frequency” of these signal(s) S must be taken into
`
`account for Tinfo to be calculated. Id.; Ex. 2009 at ¶ 51. Thus, even if Petitioner
`
`is allowed to raise the belated argument that “Tinfo” could be effectively the same
`
`as S1 or S2, which Petitioner identifies as the first and second gesture signals, the
`
`argument is contradicted by the express teachings of Burrough. Notably,
`
paragraph 46 of Burrough specifically teaches that the signal(s) S must be collectively

processed to generate “touch information Tinfo.” Burrough clearly teaches, for
`
`10180235
`
`
`
`- 18 -
`
`
`
`
`example, that Tinfo is determined by looking at “the number, combination and
`
`
`frequency of the signal(s) S.” Ex. 1005 at ¶ 46.
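The aggregate nature of Tinfo can be illustrated with a short, hypothetical sketch. The code below is an illustration of this reading of paragraph 46, not Burrough's implementation: Tinfo is computed from the number, combination, and frequency of a whole series of signals S, and so cannot be a mere repackaging of any single signal.

# Hypothetical illustration: Tinfo as an aggregate over a series of signals S.

def to_touch_info(samples):
    # samples: list of (x, y, t) tuples, one tuple per signal S
    number = len(samples)                  # the "number" of signals S
    x0, y0, t0 = samples[0]
    x1, y1, t1 = samples[-1]
    duration = t1 - t0
    frequency = number / duration if duration else 0.0
    speed = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5 / duration if duration else 0.0
    # Location, direction, and speed are properties of the series of samples,
    # not of any one (x, y, t) sample taken in isolation.
    return {"location": (x1, y1),
            "direction": (x1 - x0, y1 - y0),
            "speed": speed,
            "frequency": frequency}

print(to_touch_info([(10.0, 20.0, 0.00), (11.0, 20.5, 0.01), (13.0, 21.5, 0.02)]))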
`
Furthermore, Burrough does not teach or suggest that, as Dr. Baudisch

speculated in his deposition, signals S originating from sensing device 124 can

include information such as direction, speed, and acceleration. See Ex. 2010 at
`
`27:2-24 (equivocating as to what information is included in the signals generated
`
`from sensing device 124). To the contrary, Burrough clearly teaches that sensing
`
`device 124 produces an electrical signal "each time a finger (or other appropriate
`
object) passes a sensor," and then the signals can be considered in aggregate to
`
`"indicate location, direction, speed and acceleration of the finger." Ex. 1005 at
`
`¶ 42. Burrough further teaches that each signal individually does not convey speed
`
`or magnitude of movement—rather, the number and frequency of signals must be
`
`considered to obtain that information. Ex. 1005 at ¶ 42 ("the more signals, the
`
`more the user moved his or her finger"); id. at ¶ 46 ("Processor 106, in turn, can
`
`convert the number, combination and frequency of the signal(s) S into touch
`
`information Tinfo that can include location, direction, speed and acceleration
`
`information of touch event T."); Ex. 2009 at ¶ 52. To the extent that Petitioner
`
`attempts to argue in its reply that each signal is capable of conveying meaning or
`
`user intent because it conveys more than a single position of a finger at a moment
`
`10180235
`
`
`
`- 19 -
`
`
`
`
`in time, the argument should be rejected as unsupported by the Petition and
`
`
`contrary to Burrough's teachings.
`
B.    Burrough Does Not Disclose Claim 1 Because It Does Not Teach “Generating” a “Dynamic Interaction Parameter”
`
`Claim 1 of the ‘571 patent requires “generating a dynamic interaction
`
`parameter using the first gesture signal and the second gesture signal.” As
`
`explained above, Petitioner cannot establish that the signals identified in the
`
`Petition are “gesture signals,” and thus cannot establish a dynamic interaction
`
`parameter derived from first and second gesture signals.
`
This limitation is not taught by Burrough for the additional reason that the

supposed “dynamic interaction parameter” identified by Petitioner is neither

“generated” nor a “dynamic interaction parameter” as required by the ‘571 patent.
`
`The ’571 patent provides the following examples of ways to generate the
`
`interaction parameter.
`
`10180235
`
`
`
`- 20 -
`
`
`
`
`
`
`
`
`A POSITA would understand that, based on the teachings of the '571 patent, the
`
`dynamic interaction parameter must be calculated using the two gesture signals.
`
`Ex. 2009 at ¶ 55. For example, the '571 patent teaches that "any type of input
`
`synthesis method may be used to generate the interaction parameter," including the
`
`examples provided in Table 2 (copied above). Ex. 1001 at 15:3-7. Each of the
`
`examples in Table 2 involves a calculation, not merely a retrieval of a
`
`predetermined value from memory. Ex. 2009 at ¶ 55.
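The distinction between calculating a parameter and retrieving a stored value can be illustrated with a brief, hypothetical sketch. The code below is offered for illustration only and is not code from the '571 patent or from Burrough; a simple additive combination stands in for the synthesis methods listed in Table 2, and the database lookup stands in for retrieval of a stored haptic profile.

# Hypothetical illustration of the distinction argued in this section.

# "Generating" in the sense of the '571 patent: the parameter is calculated
# from the two gesture signals themselves.
def generate_interaction_parameter(first_gesture_signal, second_gesture_signal):
    return first_gesture_signal + second_gesture_signal   # illustrative synthesis

# A predefined haptic profile pulled from a database in memory, by contrast,
# is retrieved rather than calculated from gesture signals.
haptic_profile_database = {"profile_1": [0.2, 0.4, 0.2], "profile_2": [0.9, 0.1, 0.0]}

def retrieve_haptic_profile(profile_id):
    return haptic_profile_database[profile_id]   # lookup of a stored, static value

print(generate_interaction_parameter(0.8, 0.3))   # calculated from the signals
print(retrieve_haptic_profile("profile_1"))       # retrieved, predetermined value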
`
`Petitioner identifies H(d) as the supposed "dynamic interaction parameter"
`
`that it contends Burrough's system "generates." See Pet. at 19 ("The haptic
`
`response H(d) is a 'dynamic interaction parameter.'"). The Board instituted trial on
`
`10180235
`
`
`
`- 21 -
`
`
`
`
`Petitioner's representation that H(d) was the supposed dynamic interaction
`
`
`parameter that is generated in Burrough's system. Institution Decision at 27-28.
`
`However, H(d) is neither "dynamic" nor "generated" in Burrough's system.
`
`First, Burrough does not "generate" haptic output H(d) in the manner that
`
`the "dynamic interaction parameter" of the '571 patent is generated. Ex. 2009 at
`
`¶ 57. As explained in Burrough, the function H(d) is merely a pre-determined (i.e.,
`
`not dynamic) haptic profile pulled directly from a database in memory. For
`
`example, Burrough teaches that haptic profiles H (such as H(d)) are predefined,
`
`static values that are stored in memory:
`
`One of the advantages of the invention lies in the fact that the
`relationship between a touch event or a class of touch events and
`corresponding haptic response can be dynamic in nature. By dynamic
`it is meant that although specific haptic profiles H stored in haptic
`profile data base 134 remain static, the haptic profile (or profiles)
`used to respond to a particular event T can be varied based upon any
`number of factors.
`
`Ex. 1005 at ¶ 51. A POSITA would