`Patent No. 8,659,571
`
`
`UNITED STATES PATENT AND TRADEMARK OFFICE
`___________________
`
`BEFORE THE PATENT TRIAL AND APPEAL BOARD
`___________________
`
`
`APPLE INC.,
`Petitioner,
`
`v.
`
`IMMERSION CORPORATION,
`Patent Owner.
`___________________
`
`Case IPR2016-01372
`Patent No. 8,659,571
`___________________
`
`
`
`
`DECLARATION OF YON VISELL, PH.D.
`
`IN SUPPORT OF IMMERSION CORPORATION’S
`
`PATENT OWNER RESPONSE
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`
`Immersion Ex 2009-1
`Apple v Immersion
`IPR2016-01372
`
`
`
`
`
TABLE OF CONTENTS

I.    INTRODUCTION .......................................................... 1

II.   SUMMARY OF OPINIONS .................................................. 1

III.  QUALIFICATIONS AND EXPERIENCE ........................................ 2

IV.   LEVEL OF ORDINARY SKILL IN THE ART ................................... 5

V.    LEGAL PRINCIPLES ..................................................... 6

      A.   Claim Construction .............................................. 6

      B.   Anticipation .................................................... 7

      C.   Obviousness ..................................................... 8

VI.   THE ’571 PATENT ...................................................... 8

VII.  GROUND 1: BURROUGH DOES NOT RENDER CLAIMS 1-4, 6, 23-26 AND 28
      OBVIOUS UNDER 35 U.S.C. § 103(a) ................................... 11

      A.   Burrough does not disclose or render obvious claim 1 because
           two gesture signals are not used to form a single dynamic
           interaction parameter .......................................... 12

           1.   Burrough does not teach generating a dynamic interaction
                parameter using a first gesture signal and a second
                gesture signal ............................................ 12

           2.   Dr. Baudisch’s argument that multiple Tinfo signals could
                constitute the claimed gesture signals is inaccurate ...... 21

      B.   Burrough does not disclose or render obvious claim 1 because
           it does not teach “generating” a “dynamic interaction
           parameter” ..................................................... 24

      C.   Burrough Does Not Render Obvious Claim 1 Because There Is No
           Evidence a POSITA Would Have Modified Burrough ................. 28
`
`9698602
`
`
`
`
`
`
`
`
      D.   Burrough Does Not Disclose Or Render Obvious Claim 2 Because
           the Supposed “Gesture Signals” of Claim 1 Do Not Include
           Magnitude And Direction ........................................ 29

VIII. CONCLUSION ........................................................... 32
`
`
`
`
`
`
`
`
`
`
`
`
`1.
`
`I, Yon Visell, declare as follows:
`
`I.
`
`INTRODUCTION
`
`Case IPR2016-01372
`Patent No. 8,659,571
`
`
`2.
`
`I have been engaged by Immersion Corporation (“Immersion”) as an
`
`expert in connection with matters raised in the Petition for Inter Partes Review
`
`(“Petition”) of U.S. Patent No. 8,659,571 (the “’571 patent”) filed by Apple Inc.
`
`(“Apple” or “Petitioner”).
`
`3.
`
`This declaration is based on the information currently available to me.
`
`To the extent that additional information becomes available, I reserve the right to
`
`continue my investigation and study, which may include a review of documents
`
`and information that may be produced, as well as testimony from depositions that
`
`have not yet been taken.
`
II.   SUMMARY OF OPINIONS

4.    The ’571 patent is entitled “Interactivity Model for Shared Feedback on Mobile Devices.” The ’571 patent is directed to a novel way of producing haptic effects in electronic devices. The fundamental insight that is described and claimed in the ’571 patent is that the user’s gesture interactions with the device need to be tracked and analyzed in order to properly synchronize haptic feedback with a user’s input. Reflecting this focus, the claims specify that both a first and a second gesture signal (each based on a user’s gestural inputs) are used to generate something called a “dynamic interaction parameter.”
`
`9698602
`
`
`
`- 1 -
`
`
`
`Immersion Ex 2009-4
`Apple v Immersion
`IPR2016-01372
`
`
`
`5.
`
`The Board instituted trial on Petitioner’s Ground 1, concerning claims
`
`Case IPR2016-00896
`Patent No. 8,659,571
`
`
`
`1-4, 6, 23-26, and 28 of the ‘571 patent. Institution Decision at 45. Petitioner’s
`
`Ground 1 challenges these claims as obvious under pre-AIA 35 U.S.C. § 103(a) in
`
`light of U.S. Patent Pub. No. 2010-0156818 to Burrough et al. (“Burrough”), Ex.
`
`1005. Based on studying the petition and the exhibits cited in the petition as well
`
`as other documents, it is my opinion that claims 1-4, 6, 23-26, and 28 of the ‘571
`
`patent are not rendered obvious by Burrough.
`
III.  QUALIFICATIONS AND EXPERIENCE

6.    I obtained my Ph.D. degree in Electrical and Computer Engineering from McGill University in 2011. Before that, I received my MA in Physics from the University of Texas at Austin in 1999, and my BA in Physics from Wesleyan University in 1995.
`
`7.
`
`Since 2015, I have worked as an Assistant Professor at UCSB. From
`
`2013 to 2015, I worked as an Assistant Professor in the Department of Electrical
`
`and Computer Engineering at Drexel University.
`
`8.
`
`At UCSB, I lead the RE Touch Lab as its Director and Principal
`
`Investigator. The RE Touch Lab includes six Ph.D. students and numerous
`
`affiliated researchers and undergraduate students. Some of the topics that my
`
`teams at the RE Touch Lab have explored include computational perception, such
`
`
`
`
`
`- 2 -
`
`
`
`Immersion Ex 2009-5
`Apple v Immersion
`IPR2016-01372
`
`
`
`
`as how the mechanical signatures of contact elicit conscious perception of touch,
`
`Case IPR2016-00896
`Patent No. 8,659,571
`
`and the creation of novel haptic devices for simulating the feel of touched objects.
`
9.    My personal research focuses on haptic engineering, robotics, and the mechanics and neuroscience of touch. My work is motivated by creative applications in haptic human-computer interaction, sensorimotor augmentation, and interaction in virtual reality.
`
`10.
`
`In addition to my research at the RE Touch Lab, I also teach classes,
`
`including linear and nonlinear control systems, haptics, human-computer
`
`interaction, interactive arts, artificial intelligence, and robotics.
`
`11.
`
`I am the author of over 60 articles in journals and conference
`
`proceedings. I hold one issued patent, U.S. Patent No. 9,041,521 (“Floor-Based
`
`Haptic Communication System”), and one pending patent application (“Stretchable
`
`Tactile Sensing Array”), both pertaining to haptic technology. I am the editor of
`
`two books on virtual reality, including Human Walking in Virtual Reality. I have
`
`received several awards and honors, including the Google Faculty Research Award
`
`in 2016, and several best paper awards at haptics symposia. I have chaired and
`
`edited several conferences and symposia.
`
`12.
`
`I also have experience working in industry. Before receiving my
`
`Ph.D., I worked for several years as the Principal DSP developer, audio at Ableton,
`
`a renowned music software company. Before that I worked for several years as a
`
`
`
`
`
`- 3 -
`
`
`
`Immersion Ex 2009-6
`Apple v Immersion
`IPR2016-01372
`
`
`
`
`Research Scientist investigating speech recognition at Loquendo Inc., which is
`
`Case IPR2016-00896
`Patent No. 8,659,571
`
`now part of Nuance.
`
`13. My curriculum vitae is attached as Exhibit 2007.
`
`14.
`
`I am being compensated by Immersion for my time spent in
`
`developing this declaration at a rate of $400 per hour, and for any time spent
`
`testifying in connection with this declaration at a rate of $500 per hour. My
`
`compensation is not contingent upon the substance of my opinions, the content of
`
`this declaration or any testimony I may provide, or the outcome of the inter partes
`
`review or any other proceeding.
`
`15.
`
`I have no financial interest in Immersion, and have financial interests
`
`of less than $3000 in Apple through long-term mutual fund investments
`
`representing less than 1% of my portfolio.
`
16.   My opinions expressed in this declaration are based on the petition and exhibits cited in the petition, and other documents and materials identified in this declaration, including the ’571 patent and its prosecution history, the prior art references and materials discussed in this declaration, and any other references specifically identified in this declaration.
`
`17.
`
`I am aware of information generally available to, and relied upon by,
`
`persons of ordinary skill in the art at the relevant times, including technical
`
`
`
`
`
`- 4 -
`
`
`
`Immersion Ex 2009-7
`Apple v Immersion
`IPR2016-01372
`
`
`
`
`dictionaries and technical reference materials (including, for example, textbooks,
`
`Case IPR2016-00896
`Patent No. 8,659,571
`
`manuals, technical papers, articles, and relevant technical standards).
`
`18.
`
`I reserve the right to supplement my opinions to address any
`
`information obtained, or positions taken, based on any new information that comes
`
`to light throughout this proceeding.
`
IV.   LEVEL OF ORDINARY SKILL IN THE ART

19.   It is my understanding that the ’571 patent should be interpreted based on how it would be read by a person of ordinary skill in the art at the time of the effective filing date of the application. It is my understanding that factors such as the education level of those working in the field, the sophistication of the technology, the types of problems encountered in the art, the prior art solutions to those problems, and the speed at which innovations are made may help establish the level of skill in the art.
`
`20.
`
`I am familiar with the technology at issue and the state of the art at the
`
`earliest priority date of the ’571 patent.
`
`21.
`
`It is my opinion, based upon a review of the ’571 patent, its file
`
`history, and my knowledge of the field of the art, a person of ordinary skill in the
`
`art for the field of the ’571 patent would have at least: (1) a Bachelor's of Science
`
`degree in an engineering discipline such as Mechanical Engineering or Computer
`
`Science, or (2) at least two years' experience working with human machine
`
`
`
`
`
`- 5 -
`
`
`
`Immersion Ex 2009-8
`Apple v Immersion
`IPR2016-01372
`
`
`
`
`interface systems, graphical user interfaces, haptic feedback systems, robotics,
`
`Case IPR2016-00896
`Patent No. 8,659,571
`
`biomechanics, or mobile devices or equivalent embedded systems. A person of
`
`ordinary skill in the art would also have experience in haptic response technology
`
`in multi-touch or multi-gesture systems. This level of skill is commensurate with
`
`the interdisciplinary nature of the ’571 patent, which combines knowledge of
`
`computer software and user interface design with knowledge of electrical and/or
`
`mechanical systems for producing haptic effects.
`
`22.
`
`I have considered the issues discussed in the remainder of this
`
`declaration from this perspective of a person of ordinary skill in the art. Although
`
`I use this perspective, I do not believe that any of my opinions would change if a
`
`slightly higher or lower level of skill were assumed.
`
V.    LEGAL PRINCIPLES

      A.   Claim Construction

23.   I am not a patent attorney, and my opinions are limited to what I believe a person of ordinary skill in the art would have understood, based on the patent documents. I use the principles below, however, as a guide in formulating my opinions.
`
24.   My understanding is that a primary step in determining validity of patent claims is to properly construe the claims to determine claim scope and meaning.
`
`
`
`
`
`- 6 -
`
`
`
`Immersion Ex 2009-9
`Apple v Immersion
`IPR2016-01372
`
`
`
`25.
`
`In an inter partes review proceeding, as I understand from Immersion
`
`Case IPR2016-00896
`Patent No. 8,659,571
`
`
`
`counsel, claims are to be given their broadest reasonable construction (“BRC”) in
`
`light of the patent’s specification. 37 C.F.R. § 42.100(b). In other forums, such as
`
`in federal courts, different standards of proof and claim interpretation control,
`
`which are not applied by the patent office for inter partes review. Accordingly, I
`
`reserve the right to argue for a different interpretation or construction of the
`
`challenged claims in other proceedings, as appropriate.
`
`26.
`
`It is my understanding that in determining whether a patent claim is
`
`anticipated or obvious in view of the prior art, the patent office must construe the
`
`claim by giving the claim its broadest reasonable construction consistent with the
`
`specification. For the purposes of this review, I have construed each claim term in
`
`accordance with its plain and ordinary meaning under the required broadest
`
`reasonable construction.
`
      B.   Anticipation

27.   It is my understanding that a claim is anticipated under 35 U.S.C. § 102 if each and every element and limitation of the claim is found either expressly or inherently in a single prior art reference. I understand that anticipation is a question of fact. I further understand that the requirement of strict identity between the claim and the reference is not met if a single element or limitation required by the claim is missing from the applied reference.
`
`
`
`
`
`- 7 -
`
`
`
`Immersion Ex 2009-10
`Apple v Immersion
`IPR2016-01372
`
`
`
`
`
      C.   Obviousness

28.   It is my understanding that a claim is unpatentable under 35 U.S.C. § 103 if the claimed subject matter as a whole would have been obvious to a person of ordinary skill in the art at the time of the alleged invention. I understand that the determination of obviousness is made with respect to the subject matter as a whole, not separate pieces of the claim. I understand that obviousness is a question of law based on underlying factual issues. I also understand that an obviousness analysis takes into account the scope and content of the prior art, the differences between the claimed subject matter and the prior art, the level of ordinary skill in the art at the time of the invention, and the existence of secondary considerations such as commercial success or long-felt but unresolved needs.
`
VI.   THE ’571 PATENT

29.   I have read and reviewed the ’571 patent and have an understanding of its background as well as its particular improvements over the prior art. I understand that the ’571 patent is entitled “Interactivity Model for Shared Feedback on Mobile Devices.” In my opinion, the ’571 patent is directed to a novel way of producing haptic effects in electronic devices. A person of ordinary skill in the art would recognize that a fundamental insight that is described and claimed in the ’571 patent is that the user’s gestural interactions with the device need to be tracked and analyzed in order to properly synchronize haptic feedback with a user’s input. Reflecting this focus, the claims specify that both a first and a second gesture signal (each based on a user’s gestural inputs) are used to generate something called a “dynamic interaction parameter.” Ex. 1001 at claim 1 (“receiving a first gesture signal; receiving a second gesture signal; generating a dynamic interaction parameter using the first gesture signal and the second gesture signal”). I understand that given the format of the claims, a single gesture signal is insufficient to form the dynamic interaction parameter.
`
`30.
`
`In my opinion, the dynamic interaction parameter is meant to
`
`accurately and responsively track the user’s behavior. As such, a person of
`
`ordinary skill in the art would understand that the dynamic interaction parameter
`
`changes or reacts in real time to the user’s interactions, and is used to alter the
`
`haptic effects produced by the device. This allows the device to provide
`
`responsive haptic feedback to the user. Ex. 1001 at 1:29-33 (“[V]ibrotactile haptic
`
`effects . . . may be useful in providing cues to users of electronic devices to alert
`
`the user to specific events, or provide realistic feedback to create greater sensory
`
`immersion within a simulated or virtual environment.”). A person of ordinary skill
`
`in the art would recognize that the approach of the ’571 patent is an improvement
`
`over the prior art because the ’571 patent’s techniques can improve the timing
`
`and/or nature of haptic feedback: “[B]ecause these user gestures and system
`
`animations have variable timing, the correlation to haptic feedback [in the prior art]
`
`
`
`
`
`- 9 -
`
`
`
`Immersion Ex 2009-12
`Apple v Immersion
`IPR2016-01372
`
`
`
`
`may be static and inconsistent and therefore less compelling to the user.” Id. at
`
`Case IPR2016-00896
`Patent No. 8,659,571
`
`1:49-56.
`
31.   Other ingredients may be used in addition to a first gesture signal and a second gesture signal to generate the dynamic interaction parameter. For example, additional device sensor signals may be used. Id. at claim 7. A person of ordinary skill in the art would recognize that using these additional ingredients is another improvement over the prior art. E.g., id. at 1:56-60 (“Further, device sensor information is typically not used in combination with gestures to produce haptic feedback.”). The various ingredients may be combined and processed in several different ways to generate the dynamic interaction parameter. See, e.g., id. at Table 2 (listing 14 different example “methods of synthesis” that may be employed). In my opinion, the dependent claims of the ’571 patent show that the generation of the dynamic interaction parameter using both a first gesture signal and a second gesture signal, including the selection and processing of the ingredients, is the inventive focus. A person of ordinary skill in the art reading the patent would understand that the claims require specific ingredients in specific numbers to be used to generate the dynamic interaction parameter. E.g., id. at claim 7 (“receiving a first device sensor signal; receiving a second device sensor signal; and wherein generating a dynamic interaction parameter comprises generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and the first device sensor signal and the second device sensor signal”).
`
32.   Once the dynamic interaction parameter has been generated using a first gesture signal, a second gesture signal, and potentially other ingredients, it is used to modify the haptic output of the system. Id. at 15:8-9 (“At 1313, a drive signal is applied to a haptic actuator according to the interaction parameter.”); see also claim 1 (“applying a drive signal to a haptic output device according to the dynamic interaction parameter”). For example, in one embodiment, a user may scroll between different film frames on an electronic device with a touchscreen, and may receive haptic feedback for that interaction. Id. at 13:56-61 (“By using gestures or device sensor data, a user may scroll the filmstrip from left to right or right to left, and the filmstrip application may then dynamically provide a haptic effect for a first photograph 1101 which is different from a haptic effect for a second photograph 1103 based upon the gestures or device sensor data.”).
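For purposes of illustration only, the claimed sequence of steps can be sketched in code. The following is my own hypothetical illustration of the data flow recited in claim 1 (receive two gesture signals, generate a single dynamic interaction parameter from both, and apply a drive signal according to that parameter); the function names and the simple averaging used here are my illustrative assumptions, not anything disclosed in the ’571 patent.

```python
# Hypothetical illustration of the data flow recited in claim 1 of the
# '571 patent. All names and the averaging synthesis are illustrative
# assumptions, not the patent's disclosure.

def generate_dynamic_interaction_parameter(first_gesture, second_gesture):
    # Both gesture signals contribute to the single parameter; under the
    # claim language, one gesture signal alone is insufficient.
    return 0.5 * (first_gesture + second_gesture)

def drive_signal(parameter):
    # The drive signal applied to the haptic output device is derived
    # from the dynamic interaction parameter.
    return {"amplitude": parameter}

first_gesture_signal = 0.2   # received first gesture signal
second_gesture_signal = 0.6  # received second gesture signal
parameter = generate_dynamic_interaction_parameter(
    first_gesture_signal, second_gesture_signal)
signal = drive_signal(parameter)
```

The point of the sketch is structural: the dynamic interaction parameter is a single quantity computed from two distinct gesture signals, and the haptic drive signal depends on that parameter.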
`
VII.  GROUND 1: BURROUGH DOES NOT RENDER CLAIMS 1-4, 6, 23-26 AND 28 OBVIOUS UNDER 35 U.S.C. § 103(a)

33.   It is my opinion that Apple has failed to establish that Burrough renders these claims obvious, for at least the reasons expressed below.
`
`
`
`
`
`- 11 -
`
`
`
`Immersion Ex 2009-14
`Apple v Immersion
`IPR2016-01372
`
`
`
`
`
      A.   Burrough does not disclose or render obvious claim 1 because two gesture signals are not used to form a single dynamic interaction parameter

           1.   Burrough does not teach generating a dynamic interaction parameter using a first gesture signal and a second gesture signal
`
34.   Claim 1 requires “generating a dynamic interaction parameter using the first gesture signal and the second gesture signal”—in other words, the dynamic interaction parameter must be generated using both “a first gesture signal” and “a second gesture signal.” In this claim, the dynamic interaction parameter is then used to provide a haptic output. See Ex. 1001 at claim 1 (“applying a drive signal to a haptic output device according to the dynamic interaction parameter”).
`
`35.
`
`I understand that the Board construed the term gesture signal as "a
`
`signal indicating a movement of the body that conveys meaning or user intent."
`
`Institution Decision at 12. Inserting this construction for gesture signal into the
`
`claims, the dynamic interaction parameter must be generated with a first signal
`
`indicating a movement of the body that conveys meaning or user intent and a
`
`separate second signal indicating a movement of the body that conveys meaning or
`
`user intent. That is, a single haptic output must be based on a first signal that
`
`conveys meaning or user intent and a separate second signal that conveys meaning
`
`or a user intent.
`
`
`
`
`
`- 12 -
`
`
`
`Immersion Ex 2009-15
`Apple v Immersion
`IPR2016-01372
`
`
`
`36.
`
`I understand that Petitioner points to “signal(s) S,” which are “signals
`
`Case IPR2016-00896
`Patent No. 8,659,571
`
`
`
`representing each touch on the touch screen” (Paper 7 at 25) as the claimed first
`
`and second gesture signals. In particular, Burrough teaches:
`
`In response to the pressure applied by the user during touch event T,
`sensing device 124 generates touch signal S1 (and any other signal
`consistent with a multi-touch event). Touch signal S1 can be
`monitored by an electronic interface (not shown) and passed to
`processor 106. Processor 106, in turn, can convert the number,
`combination and frequency of the signal(s) S into Touch information
`Tinfo that can include location, direction, speed and acceleration
`information of touch event T.
`Ex. 1005 at ¶ 46. This portion of Burrough teaches that for each touch in a multi-
`
`touch event, a signal S is generated. For instance, according to this teaching, in a
`
`single-finger touch event signal S1 would be generated at a single moment in time
`
`to reflect the one-finger touch. Likewise, for a two-finger touch event, a POSITA
`
`would understand that both signals S1 and S2 would be generated at a moment in
`
`time to reflect the two-finger touch. This understanding is confirmed by
`
`Burrough’s disclosure that sensing device 14 produces “an electrical signal . . .
`
`each time a finger (or other appropriate object) passes a sensor.” Ex. 1005 at ¶ 42.
`
`In other words, each signal (such as signals S1, S2, etc.) is a representation of a
`
`finger passing a sensor at a given moment in time. Burrough teaches that when
`
`these signals are considered collectively, information such as speed and direction
`
`
`
`
`
`- 13 -
`
`
`
`Immersion Ex 2009-16
`Apple v Immersion
`IPR2016-01372
`
`
`
`
`can be determined. Ex. 1005 at ¶ 42 (“the more signals, the more the user moved
`
`Case IPR2016-00896
`Patent No. 8,659,571
`
`his or her finger”); Ex. 1005 at ¶ 46 (“Processor 106 . . . can convert the number,
`
`combination and frequency of the signal(s) S into touch information Tinfo that can
`
`include location, direction, speed and acceleration information of touch event T.”).
`
`But taken in isolation, each signal S1 and S2 merely indicates that a finger has
`
`passed a sensor on sensing device 124 at a particular moment in time. See Ex.
`
`1005 ¶¶ 42, 46.
`
`37.
`
`I understand that the “signal(s) S” such as S1, S2, etc., representing
`
`each touch at a moment in time as applied to the zoom embodiment of Burrough
`
`are what Petitioner maps to the gesture signals in Petitioner’s claim 1 analysis. For
`
`instance, Petitioner specifically equates the two signals S1 and S2 resulting from
`
`sensing device 124 and representing two different touches as a first gesture signal
`
`and second gesture signal respectively. Pet. at 15-16 (quoting Ex. 1005 at
`
`explaining that “touch signal S1” is a “gesture signal”); id. at 16 (explaining that in
`
`“a multi-touch zoom gesture,” “sensing device 124 generates signals representing
`
`each touch on the touchscreen,” and that “a POSITA would understand that the
`
`sensing device generates a first gesture signal representing one of the two fingers
`
`on the touch screen, and a second gesture signal representing the other finger on
`
`the touchscreen”). I further understand that Petitioner’s expert, Dr. Baudisch,
`
`confirmed at his deposition that the only signals he points to as the “first gesture
`
`
`
`
`
`- 14 -
`
`
`
`Immersion Ex 2009-17
`Apple v Immersion
`IPR2016-01372
`
`
`
`
`signal” and “second gesture signal” come from sensing device 124. Ex. 2010 at
`
`Case IPR2016-00896
`Patent No. 8,659,571
`
`16:9-12 (agreeing that “the gesture signals then are created by sensing device
`
`124”).
`
`38.
`
`I disagree with Petitioner’s contention that signals S1 and S2 in a
`
`multi-touch zoom gesture are the claimed “first gesture signal” and “second
`
`gesture signal,” because in my opinion neither of the S1 or S2 signals is a “signal
`
`indicating a movement of the body that conveys meaning or user intent.” Rather, a
`
`POSITA would understand that each of S1 and S2 is merely an indication that a
`
`user object (such as a finger) has come into contact with a sensor at a particular
`
`moment in time.
`
`39.
`
`Indeed, I understand that Petitioner’s expert admits that a single
`
`indication that a finger has contacted a screen at a particular location (such as that
`
`provided by S1 or S2) is not an indication of intent in Burrough’s zoom gesture
`
`embodiment. See Ex. 2010 at 43:17-44:15 (explaining that intent is only
`
`determined once the distance between two fingers can be understood as increasing
`
`or decreasing). The fact that individual senses of touch (such as S1 and S2) do not
`
`convey meaning or user intent is confirmed by Figure 11 of Burrough. Figure 11,
`
`reproduced below, is a flow-chart “diagram of a zoom gesture method” (Ex. 1013
`
`at ¶ 79)—the same embodiment that Petitioner relies upon for obviousness.
`
`
`
`
`
`- 15 -
`
`
`
`Immersion Ex 2009-18
`Apple v Immersion
`IPR2016-01372
`
`
`
`
`
[Figure 11 of Burrough (Ex. 1005): flow chart of the zoom gesture method]
`
`
`
40.   Figure 11 shows that the process does not begin until the presence of a first finger and the presence of a second finger is detected. A POSITA would understand that such a detection would generate at least two signals (e.g., an S1 signal and an S2 signal). See, e.g., Ex. 1005 at ¶ 46; Ex. 2010 at 36:10-13 (agreeing that “when two fingers touch a screen, . . . there must be more than one S1 signal being produced”). Accordingly, a POSITA would understand that for Burrough’s system to even begin the decision flow to determine whether a zoom in or zoom out gesture may occur, at least two signals S must be generated by sensing device 124.
`
41.   After the presence of two fingers is detected (as a result of two separate signals S1 and S2 detected simultaneously), the distance between the two fingers is compared in step 1108. A POSITA would recognize that step 1108 requires a comparison between two S signals—for example, the position associated with a first signal S1 can be compared with the position of a second signal S2 to calculate a distance. See Ex. 1005 at ¶ 42 (“an electrical signal is produced each time a finger (or other appropriate object) passes a sensor”).
`
42.   Then, in step 1110, the process determines whether the distance between fingers is increasing or decreasing. Because movement for each finger is represented by multiple signals S, the determination of whether distance is increasing or decreasing would require knowing even more signals S. See Ex. 1005 at ¶ 42 (explaining that multiple signals need to be examined to determine the distance a user moved a single finger—“the more signals, the more the user moved his or her finger”). For example, if S1 and S2 represent a first distance at a first moment in time, and S3 and S4 represent a second distance at a second moment in time, the system would compare the two distance values to determine whether the distance is increasing or decreasing.
`
43.   This step 1110, which determines whether the distance is increasing or decreasing, makes the determination regarding user intent, because it is at that point that the system determines whether a zoom in or a zoom out signal should be generated. See Fig. 11 (showing step 1110 branching between two options—“Generate zoom in signal” 1112 and “Generate zoom out signal” 1114). Accordingly, the user intent of zooming in or zooming out in Burrough cannot be determined by a single data point (such as S1 or S2) provided by just one of the fingers—information from numerous signals S must be considered together. Thus, one signal S1 (and even two signals S1 and S2) cannot indicate a movement of a body that conveys meaning or user intent in Burrough’s zoom gesture.
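The decision flow of Figure 11 discussed above can be summarized in the following sketch. It is my own illustrative reconstruction of the flow chart, not Burrough’s code, and its names are hypothetical; it shows that a zoom-in or zoom-out determination requires finger positions sampled at more than one moment in time, so no single signal S can carry that intent.

```python
# Illustrative reconstruction of the Figure 11 zoom-gesture decision flow.
# Each sample holds the two finger positions at one moment in time; the
# zoom in / zoom out determination (step 1110) needs at least two samples.

def fingers_distance(p, q):
    # Euclidean distance between the two finger positions (step 1108)
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

def zoom_decision(samples):
    """samples: list of ((x1, y1), (x2, y2)) finger positions over time."""
    if len(samples) < 2:
        return None  # a single moment in time cannot reveal zoom intent
    d_first = fingers_distance(*samples[0])
    d_last = fingers_distance(*samples[-1])
    if d_last > d_first:
        return "zoom in"    # step 1112: distance increasing
    if d_last < d_first:
        return "zoom out"   # step 1114: distance decreasing
    return None

# Fingers spreading apart over two successive samples
decision = zoom_decision([((0, 0), (10, 0)), ((0, 0), (20, 0))])
```

The sketch makes the structural point explicit: the branch between zoom in and zoom out is only reachable once distances from at least two moments in time are compared.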
`
`44.
`
`In my opinion, Petitioner does not explain how each individual signal,
`
`such as S1 or S2, could supposedly constitute a gesture signal in the zoom gesture
`
`embodiment of Burrough. Furthermore, I understand the testimony of Dr.
`
`Baudisch to confirm the understanding that each individual S signal does not
`
`indicate a movement of the body that conveys meaning or user intent. Dr.
`
`Baudisch agreed that in Burrough’s zoom gesture embodiment, no conclusion
`
`
`
`
`
`- 18 -
`
`
`
`Immersion Ex 2009-21
`Apple v Immersion
`IPR2016-01372
`
`
`
`
`about user intent to zoom in or zoom out can be made until step 1110 in Figure 11,
`
`Case IPR2016-00896
`Patent No. 8,659,571
`
`which considers the position of at least two fingers over some time period:
`
`But the main decision in figure 11 seems to be shown in figure—in
`1110, where it’s actually checking if the distance is increasing or
`decreasing. That certainly is a point at which the system, you know,
`seems to draw conclusions about user intent, which is whether to
`zoom in or zoom out.
`Ex. 2010 at 43:17-44:15. I agree with this statement. In my opinion, even looking
`
`at two S signals simultaneously is insufficient to indicate a movement of the body
`
`that conveys meaning or user intent in Burrough’s zoom embodiment. Rather, a
`
`POSITA would understand that a greater number of S signals must be examined
`
`over time before that intent can be determined.
`
45.   Furthermore, Burrough teaches that placing two fingers on the screen simultaneously can convey an entirely different intent than an intent to zoom. For instance, Burrough teaches that “a first object can be dragged with one finger while a second object can be dragged with another finger.” Ex. 1005 at ¶ 45. Dr. Baudisch admits (and I agree) that these are two simultaneously occurring gestures. Ex. 2010 at 52:14-22. Accordingly, even the presence of two individual S1 and S2 signals, which indicate that there are two fingers touching the screen, would be insufficient to determine that any sort of zoom will be initiated in Burrough—other gestures, such as dragging objects across the screen, could also be possible according to Burrough’s teaching and Dr. Baudisch’s testimony.
`
46.   Because each S signal cannot individually indicate a movement of the body that conveys a meaning or user intent in Burrough’s zoom gesture embodiment, it is my opinion that each S signal is not a “gesture signal” as the term was construed by the Board. To the extent that Petitioner a