Case IPR2016-00896
Patent No. 8,659,571


UNITED STATES PATENT AND TRADEMARK OFFICE
___________________

BEFORE THE PATENT TRIAL AND APPEAL BOARD
___________________


APPLE INC.,
Petitioner,

v.

IMMERSION CORPORATION,
Patent Owner.
___________________

Case IPR2016-00896
Patent No. 8,659,571
___________________


DECLARATION OF YON VISELL, PH.D.
IN SUPPORT OF IMMERSION CORPORATION’S
PATENT OWNER PRELIMINARY RESPONSE


Immersion Ex 2007-1
Apple v Immersion
IPR2017-00896

TABLE OF CONTENTS

I.    INTRODUCTION ................................................................................ 1
II.   SUMMARY OF OPINIONS ................................................................ 1
III.  QUALIFICATIONS AND EXPERIENCE .......................................... 3
IV.   LEVEL OF ORDINARY SKILL IN THE ART .................................. 6
V.    LEGAL PRINCIPLES .......................................................................... 7
      A.  Claim Construction .................................................................... 7
      B.  Anticipation ................................................................................ 8
      C.  Obviousness ................................................................................ 9
VI.   THE ’571 PATENT .............................................................................. 9
VII.  CLAIM CONSTRUCTION ................................................................ 12
      A.  “gesture signal” (claims 1-7, 23-29) ........................................ 12
      B.  “dynamic interaction parameter” (claims 1, 4-7, 12, 15-18, 23, 26-29) ........ 13
      C.  “vector signal” (claims 2, 13, 24) ............................................ 14
VIII. GROUND 1: POUPYREV DOES NOT RENDER CLAIMS 1-4, 7, 23-26 and 29 OBVIOUS UNDER 35 U.S.C. § 103(a) .................... 14
      A.  Poupyrev does not disclose or render obvious claim 1 ............ 14
      B.  Poupyrev does not disclose or render obvious claim 2 ............ 22
      C.  Poupyrev does not disclose or render obvious claim 3 ............ 24
      D.  Poupyrev does not disclose or render obvious claim 4 ............ 24
      E.  Poupyrev does not disclose or render obvious claim 7 ............ 25
      F.  Poupyrev does not disclose or render obvious claims 23-26 or 29 ........ 26
IX.   GROUNDS 2-3: POUPYREV IN VIEW OF OTHER REFERENCES DOES NOT RENDER CLAIMS 5-6 AND 27-28 OBVIOUS UNDER 35 U.S.C. § 103(a) ............................................ 27
X.    CONCLUSION .................................................................................... 27

1.    I, Yon Visell, declare as follows:

I.    INTRODUCTION

2.    I have been engaged by Immersion Corporation (“Immersion”) as an expert in connection with matters raised in the Petition for Inter Partes Review (“Petition”) of U.S. Patent No. 8,659,571 (the “’571 patent”) filed by Apple Inc. (“Apple” or “Petitioner”).

3.    This declaration is based on the information currently available to me. To the extent that additional information becomes available, I reserve the right to continue my investigation and study, which may include a review of documents and information that may be produced, as well as testimony from depositions that have not yet been taken.

II.   SUMMARY OF OPINIONS

4.    The ’571 patent is entitled “Interactivity Model for Shared Feedback on Mobile Devices.” The ’571 patent is directed to a novel way of producing haptic effects in electronic devices. The fundamental insight that is described and claimed in the ’571 patent is that the user’s gesture interactions with the device need to be tracked and analyzed in order to properly synchronize haptic feedback with a user’s input. Reflecting this focus, the claims specify that both a first and a second gesture signal (each based on a user’s gestural inputs) are used to generate something called a “dynamic interaction parameter.” The petition challenges claims 1-7 and 23-29 of the ’571 patent.

5.    The petition raises three grounds, each based on obviousness under pre-AIA 35 U.S.C. § 103(a). Ground 1 argues that claims 1-4, 7, 23-26 and 29 of the ’571 patent are obvious in light of U.S. Patent No. 7,952,566 (“Poupyrev”), Ex. 1013. Based on studying the petition and the exhibits cited in the petition, as well as other documents, it is my opinion that claims 1-4, 7, 23-26 and 29 of the ’571 patent are not rendered obvious by Poupyrev.

6.    Ground 2 argues that claims 5 and 27 are obvious in light of Poupyrev and A FORCE FEEDBACK PROGRAMMING PRIMER by Louis Rosenberg (“Primer”), Ex. 1017. Based on studying the petition and the exhibits cited in the petition, as well as other documents, it is my opinion that claims 5 and 27 are not rendered obvious by Poupyrev in view of Primer.

7.    Ground 3 argues that claims 6 and 28 are obvious in light of Poupyrev and Canadian Patent App. No. 2,059,893 A1 (“Tecot”), Ex. 1015. Based on studying the petition and the exhibits cited in the petition, as well as other documents, it is my opinion that claims 6 and 28 are not rendered obvious by Poupyrev in view of Tecot.

8.    Grounds 4 and 5 argue invalidity of claims 1-6 and 23-29 in view of U.S. Patent No. 5,734,373 (“Rosenberg ’373,” Ex. 1004) alone, or in combination with U.S. Patent No. 6,429,846 (“Rosenberg ’846,” Ex. 1006). I understand that these grounds are duplicative of grounds that the Board has previously denied, and Immersion has not asked me to evaluate these grounds as part of this declaration. If the Board decides to institute a trial on these grounds, I reserve the right to provide opinions regarding Petitioner’s grounds 4 and 5.

III.  QUALIFICATIONS AND EXPERIENCE

9.    I obtained my Ph.D. degree in Electrical and Computer Engineering from McGill University in 2011. Before that, I received my MA in Physics from the University of Texas at Austin in 1999, and my BA in Physics from Wesleyan University in 1995.

10.   Since 2015, I have worked as an Assistant Professor at the University of California, Santa Barbara (“UCSB”). From 2013 to 2015, I worked as an Assistant Professor in the Department of Electrical and Computer Engineering at Drexel University.

11.   At UCSB, I lead the RE Touch Lab as its Director and Principal Investigator. The RE Touch Lab includes six Ph.D. students and numerous affiliated researchers and undergraduate students. Some of the topics that my teams at the RE Touch Lab have explored include computational perception, such as how the mechanical signatures of contact elicit conscious perception of touch, and the creation of novel haptic devices for simulating the feel of touched objects.

12.   My personal research focuses on haptic engineering, robotics, and the mechanics and neuroscience of touch. My work is motivated by creative applications in haptic human-computer interaction, sensorimotor augmentation, and interaction in virtual reality.

13.   In addition to my research at the RE Touch Lab, I also teach classes, including linear and nonlinear control systems, haptics, human-computer interaction, interactive arts, artificial intelligence, and robotics.

14.   I am the author of over 60 articles in journals and conference proceedings. I hold one issued patent, U.S. Patent No. 9,041,521 (“Floor-Based Haptic Communication System”), and one pending patent application (“Stretchable Tactile Sensing Array”), both pertaining to haptic technology. I am the editor of two books on virtual reality, including Human Walking in Virtual Reality. I have received several awards and honors, including the Google Faculty Research Award in 2016, and several best paper awards at haptics symposia. I have chaired and edited several conferences and symposia.

15.   I also have experience working in industry. Before receiving my Ph.D., I worked for several years as the principal DSP developer for audio at Ableton, a renowned music software company. Before that, I worked for several years as a Research Scientist investigating speech recognition at Loquendo Inc., which is now part of Nuance.

16.   My curriculum vitae is attached as Exhibit 2006.

17.   I am being compensated by Immersion for my time spent in developing this declaration at a rate of $400 per hour, and for any time spent testifying in connection with this declaration at a rate of $500 per hour. My compensation is not contingent upon the substance of my opinions, the content of this declaration or any testimony I may provide, or the outcome of the inter partes review or any other proceeding.

18.   I have no financial interest in Immersion, and have financial interests of less than $3,000 in Apple through long-term mutual fund investments representing less than 1% of my portfolio.

19.   My opinions expressed in this declaration are based on the petition and exhibits cited in the petition, and other documents and materials identified in this declaration, including the ’571 patent and its prosecution history, the prior art references and materials discussed in this declaration, and any other references specifically identified in this declaration.

20.   I am aware of information generally available to, and relied upon by, persons of ordinary skill in the art at the relevant times, including technical dictionaries and technical reference materials (including, for example, textbooks, manuals, technical papers, articles, and relevant technical standards).

21.   I reserve the right to supplement my opinions to address any information obtained, or positions taken, based on any new information that comes to light throughout this proceeding.

IV.   LEVEL OF ORDINARY SKILL IN THE ART

22.   It is my understanding that the ’571 patent should be interpreted based on how it would be read by a person of ordinary skill in the art at the time of the effective filing date of the application. It is my understanding that factors such as the education level of those working in the field, the sophistication of the technology, the types of problems encountered in the art, the prior art solutions to those problems, and the speed at which innovations are made may help establish the level of skill in the art.

23.   I am familiar with the technology at issue and the state of the art at the earliest priority date of the ’571 patent.

24.   It is my opinion, based upon a review of the ’571 patent, its file history, and my knowledge of the field of the art, that a person of ordinary skill in the art for the field of the ’571 patent would have at least: (1) a Bachelor of Science degree in an engineering discipline such as Mechanical Engineering or Computer Science, or (2) at least two years’ experience working with human-machine interface systems, graphical user interfaces, haptic feedback systems, robotics, biomechanics, or mobile devices or equivalent embedded systems. A person of ordinary skill in the art would also have experience in haptic response technology in multi-touch or multi-gesture systems. This level of skill is commensurate with the interdisciplinary nature of the ’571 patent, which combines knowledge of computer software and user interface design with knowledge of electrical and/or mechanical systems for producing haptic effects.

25.   I have considered the issues discussed in the remainder of this declaration from the perspective of a person of ordinary skill in the art. Although I use this perspective, I do not believe that any of my opinions would change if a slightly higher or lower level of skill were assumed.

V.    LEGAL PRINCIPLES

A.    Claim Construction

26.   I am not a patent attorney, and my opinions are limited to what I believe a person of ordinary skill in the art would have understood, based on the patent documents. I use the principles below, however, as a guide in formulating my opinions.

27.   My understanding is that a primary step in determining validity of patent claims is to properly construe the claims to determine claim scope and meaning.

28.   In an inter partes review proceeding, as I understand from Immersion counsel, claims are to be given their broadest reasonable construction (“BRC”) in light of the patent’s specification. 37 C.F.R. § 42.100(b). In other forums, such as in federal courts, different standards of proof and claim interpretation control, which are not applied by the patent office for inter partes review. Accordingly, I reserve the right to argue for a different interpretation or construction of the challenged claims in other proceedings, as appropriate.

29.   It is my understanding that in determining whether a patent claim is anticipated or obvious in view of the prior art, the patent office must construe the claim by giving the claim its broadest reasonable construction consistent with the specification. For the purposes of this review, I have construed each claim term in accordance with its plain and ordinary meaning under the required broadest reasonable construction.

B.    Anticipation

30.   It is my understanding that a claim is anticipated under 35 U.S.C. § 102 if each and every element and limitation of the claim is found either expressly or inherently in a single prior art reference. I understand that anticipation is a question of fact. I further understand that the requirement of strict identity between the claim and the reference is not met if a single element or limitation required by the claim is missing from the applied reference.

C.    Obviousness

31.   It is my understanding that a claim is unpatentable under 35 U.S.C. § 103 if the claimed subject matter as a whole would have been obvious to a person of ordinary skill in the art at the time of the alleged invention. I understand that the determination of obviousness is made with respect to the subject matter as a whole, not separate pieces of the claim. I understand that obviousness is a question of law based on underlying factual issues. I also understand that an obviousness analysis takes into account the scope and content of the prior art, the differences between the claimed subject matter and the prior art, the level of ordinary skill in the art at the time of the invention, and the existence of secondary considerations such as commercial success or long-felt but unresolved needs.

VI.   THE ’571 PATENT

32.   I have read and reviewed the ’571 patent and have an understanding of its background as well as its particular improvements over the prior art. I understand that the ’571 patent is entitled “Interactivity Model for Shared Feedback on Mobile Devices.” In my opinion, the ’571 patent is directed to a novel way of producing haptic effects in electronic devices. A person of ordinary skill in the art would recognize that a fundamental insight that is described and claimed in the ’571 patent is that the user’s gestural interactions with the device need to be tracked and analyzed in order to properly synchronize haptic feedback with a user’s input. Reflecting this focus, the claims specify that both a first and a second gesture signal (each based on a user’s gestural inputs) are used to generate something called a “dynamic interaction parameter.” Ex. 1001 at claim 1 (“receiving a first gesture signal; receiving a second gesture signal; generating a dynamic interaction parameter using the first gesture signal and the second gesture signal”). I understand that given the format of the claims, a single gesture signal is insufficient to form the dynamic interaction parameter.

33.   In my opinion, the dynamic interaction parameter is meant to accurately and responsively track the user’s behavior. As such, a person of ordinary skill in the art would understand that the dynamic interaction parameter changes or reacts in real time to the user’s interactions, and is used to alter the haptic effects produced by the device. This allows the device to provide responsive haptic feedback to the user. Ex. 1001 at 1:29-33 (“[V]ibrotactile haptic effects . . . may be useful in providing cues to users of electronic devices to alert the user to specific events, or provide realistic feedback to create greater sensory immersion within a simulated or virtual environment.”). A person of ordinary skill in the art would recognize that the approach of the ’571 patent is an improvement over the prior art because the ’571 patent’s techniques can improve the timing and/or nature of haptic feedback: “[B]ecause these user gestures and system animations have variable timing, the correlation to haptic feedback [in the prior art] may be static and inconsistent and therefore less compelling to the user.” Id. at 1:49-56.

34.   Other ingredients may be used in addition to a first gesture signal and a second gesture signal to generate the dynamic interaction parameter. For example, additional device sensor signals may be used. Id. at claim 7. A person of ordinary skill in the art would recognize that using these additional ingredients is another improvement over the prior art. E.g., id. at 1:56-60 (“Further, device sensor information is typically not used in combination with gestures to produce haptic feedback.”). The various ingredients may be combined and processed in several different ways to generate the dynamic interaction parameter. See, e.g., id. at Table 2 (listing 14 different example “methods of synthesis” that may be employed). In my opinion, the dependent claims of the ’571 patent show that the generation of the dynamic interaction parameter using both a first gesture signal and a second gesture signal, including the selection and processing of the ingredients, is the inventive focus. A person of ordinary skill in the art reading the patent would understand that the claims require specific ingredients in specific numbers to be used to generate the dynamic interaction parameter. E.g., id. at claim 7 (“receiving a first device sensor signal; receiving a second device sensor signal; and wherein generating a dynamic interaction parameter comprises generating a dynamic interaction parameter using the first gesture signal and the second gesture signal and the first device sensor signal and the second device sensor signal”).

35.   Once the dynamic interaction parameter has been generated using a first gesture signal, a second gesture signal, and potentially other ingredients, it is used to modify the haptic output of the system. Id. at 15:8-9 (“At 1313, a drive signal is applied to a haptic actuator according to the interaction parameter.”); see also claim 1 (“applying a drive signal to a haptic output device according to the dynamic interaction parameter”). For example, in one embodiment, a user may scroll between different film frames on an electronic device with a touchscreen, and may receive haptic feedback for that interaction. Id. at 13:56-61 (“By using gestures or device sensor data, a user may scroll the filmstrip from left to right or right to left, and the filmstrip application may then dynamically provide a haptic effect for a first photograph 1101 which is different from a haptic effect for a second photograph 1103 based upon the gestures or device sensor data.”).

VII.  CLAIM CONSTRUCTION

A.    “gesture signal” (claims 1-7, 23-29)

36.   In IPR2016-01372, the Board construed gesture signal to mean “a signal indicating a movement of the body that conveys meaning or user intent.” IPR2016-01372, Paper 7 at 12. I apply that construction in this declaration.

37.   It is also my opinion that the distinction between signals that do and do not indicate a movement of the body that conveys meaning or user intent was also made by the applicant during prosecution. See Ex. 2005 at 9 (August 2, 2012 Applicant Remarks in prosecution of U.S. Patent No. 8,279,193, a prior patent in the same family as the ’571 patent) (“[Prior art reference raised by the examiner] Marvit describes gestures in the context of motion sensor engagement for a handheld device. For example, input movement may be in the form of translation and/or gestures. Translation-based input focuses on a beginning point and endpoint of a motion and difference between such beginning points and endpoints.”). A person of ordinary skill in the art reading this section of the prosecution history would understand that the applicant’s comments, in addition to the disclosure of the Marvit reference, show that translation input is not necessarily a gesture signal because it may not convey the requisite meaning or user intent that the Board has held must be signified by a “gesture signal.”

B.    “dynamic interaction parameter” (claims 1, 4-7, 12, 15-18, 23, 26-29)

38.   In IPR2016-01372, the Board construed this term to mean “a parameter that changes over time or reacts in real time based on a user’s interaction with a device.” I apply that construction in this declaration.

C.    “vector signal” (claims 2, 13, 24)

39.   It is my opinion that the broadest reasonable construction of “vector signal” is “a signal that includes both a magnitude and direction.” See, e.g., Petition at 5. I apply that construction in this declaration.

VIII. GROUND 1: POUPYREV DOES NOT RENDER CLAIMS 1-4, 7, 23-26 AND 29 OBVIOUS UNDER 35 U.S.C. § 103(a)

40.   It is my opinion that Apple has failed to establish that Poupyrev, standing alone, renders these claims obvious for at least the reasons expressed below.

A.    Poupyrev does not disclose or render obvious claim 1

41.   In my opinion, Apple has failed to show that Poupyrev renders any challenged claim of the ’571 patent obvious under pre-AIA 35 U.S.C. § 103(a). Claim 1 of the ’571 patent requires that both a “first gesture signal” and a “second gesture signal” be received, and that both of these signals are used in generating the “dynamic interaction parameter.” Claim 1 reads:

      1. A method of producing a haptic effect comprising:
            receiving a first gesture signal;
            receiving a second gesture signal;
            generating a dynamic interaction parameter using the first gesture signal and the second gesture signal; and
            applying a drive signal to a haptic output device according to the dynamic interaction parameter.

Ex. 1001 at claim 1.

42.   In my opinion, the position and pressure detections that Apple identifies as the gesture signals in Poupyrev are not gesture signals because, standing alone, they do not indicate a movement of the body that conveys meaning or user intent. Instead, the “gesture signal” from which the haptic output is derived is the output from a GUI controller 112.

43.   The system described in Poupyrev for generating a haptic (or tactile) output to the user is shown in Figure 1, reproduced below. See generally Ex. 1013 at 4:61-7:42.

[Poupyrev (Ex. 1013), Figure 1]

44.   Apple points to the detection of position (in 2D position sensing unit 104) and pressure (in pressure sensing unit 105) as being two different gesture signals. Pet. at 10.

45.   In my opinion, position detection and pressure detection are performed together in Poupyrev to determine, at most, a single gesture, rather than two different gestures. As such, the position and pressure signals do not each constitute a gesture signal, because neither of them alone is a signal indicating a movement of the body that conveys meaning or user intent in Poupyrev’s system.

46.   Contrary to Apple’s view, Poupyrev makes clear that its system’s Graphical User Interface (“GUI”) controller 112, rather than the position and pressure sensing units 104 and 105, determines whether a physical user input is a gesture. See, e.g., Ex. 1013 at 7:14-18 (“The GUI controller 112 determines which GUI object the user 2 is intending to interact with.”) (emphasis added). The process by which a gesture is determined in Poupyrev’s system is further illustrated in Figure 4, reproduced below:

[Poupyrev (Ex. 1013), Figure 4]

47.   In this process, Poupyrev’s system determines a position of the user finger or pen in step 210. Ex. 1013 at 8:25-27. In step 211, the process determines whether the position of the finger or pen is inside the boundary of a GUI element. Id. at 8:27-31. If the result of this step is “no,” a gesture has not been performed, and the process returns to step 210 to again track the position of the user’s finger or pen. Id. at Fig. 4. In this case, the pressure information is not used in the process at all, because step 212 is only reached if the position information indicates that the finger or pen is inside a GUI element. See id.

48.   In step 212, the process determines if there is a pressing event (i.e., whether the user intended to interact with a GUI element). This determination occurs only after the location sensing unit 104 determines that the user’s finger is in a particular location and after the pressure sensing unit 105 determines that the pressure is “more than a predetermined value.” See id. at 8:25-35. Thus, only after both position and pressure information have been analyzed is it determined that a user has intended to activate a particular GUI object, and a corresponding haptic feedback is output by the device. See id. at 8:47-49 (“[I]t is possible to let the user 2 know[] that the selected GUI object 203 is activated with the haptic feedback.”).

49.   In the above-described process, neither the pressure detection nor the position detection contains sufficient information to indicate a movement of the body that conveys meaning or user intent. Rather, a movement of the body that conveys meaning or user intent is discernible only after both position and pressure are considered in conjunction with one another. For example, if a user finger or pen is not within a GUI element, no meaning or user intent can be derived from the user input, regardless of the pressure applied. See id. at Fig. 4. Likewise, even if the user’s finger or pen is located over a GUI element, only if sufficient pressure is applied will Poupyrev’s system indicate the meaning or user intent to activate that GUI element. Id. The position or pressure alone is insufficient to indicate a body movement that conveys meaning or user intent in this process.

50.   Accordingly, a person of ordinary skill in the art (“POSITA”) would understand that in Poupyrev’s system, position and pressure information must be considered in tandem to indicate a movement of the body that conveys meaning or user intent. Neither of these signals individually is a gesture signal.

51.   Poupyrev’s single discussion of the term “gesture” confirms this understanding. In particular, Poupyrev teaches that even if a user moves the finger or pen over a GUI object, the “actuation event” does not occur until a certain pressure is applied to the object. Id. at 10:3-11. Poupyrev defines its “actuation event” as “some gesture that allows the user to specify that the GUI object 310 should be actuated.” Id. (emphasis added). Poupyrev thus only teaches a single gesture—the actuation event—that involves both location and pressure together. It is my opinion that a POSITA would understand, based on this teaching, that a location signal or a pressure signal, standing alone, would not indicate a movement of the body that conveys the requisite meaning or user intent to constitute a gesture signal in Poupyrev’s system.

52.   This understanding is further consistent with other discussions in Poupyrev of “interactions” that require complex sequences of pressure and position over time in order to identify a movement of the body that conveys meaning or user intent. Ex. 1013 at 7:47-54 (discussing “interactions” in a prior art system). For instance, one of the interactions involves “drag[ging] a finger across the input space” (id.), which could generate dozens, hundreds, or even thousands of updates from a position-sensing unit. Under Apple’s interpretation, each of these updates would individually be a gesture signal, even though each location update, taken individually, does not indicate a movement of the body that conveys meaning or user intent in Poupyrev’s system.

53.   None of the examples cited by Apple teaches that a pressure or position reading, by itself, can indicate a movement of the body that conveys meaning or user intent in Poupyrev’s system. For example, Petitioner alleges that Poupyrev discloses “multiple ‘pressure’ gestures,” which actually depend on both pressure and position (i.e., whether the finger is “inside of the GUI object”). Pet. at 14-15. Apple also cites to a “typical touch-screen interaction,” but does not identify any pressure or position reading that indicates a movement of the body that conveys meaning or user intent that is used to generate a haptic effect. Id. at 14. In any event, it is unclear why Apple relies on this discussion, as Poupyrev makes clear that this “typical touch-screen interaction” is “of related art,” and thus does not describe how Poupyrev itself is implemented. See Pet. at 14; see also Ex. 1013 at 3:58-59 (“FIG. 2. is a schematic diagram showing an example of interaction with touch screens of prior art.”). Apple provides no explanation for why a POSITA would have modified Poupyrev to incorporate the prior art approach expressly distinguish
